Jan 26 18:35:43 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 26 18:35:43 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 26 18:35:43 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 18:35:43 localhost kernel: BIOS-provided physical RAM map:
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 26 18:35:43 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 26 18:35:43 localhost kernel: NX (Execute Disable) protection: active
Jan 26 18:35:43 localhost kernel: APIC: Static calls initialized
Jan 26 18:35:43 localhost kernel: SMBIOS 2.8 present.
Jan 26 18:35:43 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 26 18:35:43 localhost kernel: Hypervisor detected: KVM
Jan 26 18:35:43 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 26 18:35:43 localhost kernel: kvm-clock: using sched offset of 3507606342 cycles
Jan 26 18:35:43 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 26 18:35:43 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 26 18:35:43 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 26 18:35:43 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 26 18:35:43 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 26 18:35:43 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 26 18:35:43 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 26 18:35:43 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 26 18:35:43 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 26 18:35:43 localhost kernel: Using GB pages for direct mapping
Jan 26 18:35:43 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 26 18:35:43 localhost kernel: ACPI: Early table checksum verification disabled
Jan 26 18:35:43 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 26 18:35:43 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 18:35:43 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 18:35:43 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 18:35:43 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 26 18:35:43 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 18:35:43 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 26 18:35:43 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 26 18:35:43 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 26 18:35:43 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 26 18:35:43 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 26 18:35:43 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 26 18:35:43 localhost kernel: No NUMA configuration found
Jan 26 18:35:43 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 26 18:35:43 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 26 18:35:43 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 26 18:35:43 localhost kernel: Zone ranges:
Jan 26 18:35:43 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 26 18:35:43 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 26 18:35:43 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 18:35:43 localhost kernel:   Device   empty
Jan 26 18:35:43 localhost kernel: Movable zone start for each node
Jan 26 18:35:43 localhost kernel: Early memory node ranges
Jan 26 18:35:43 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 26 18:35:43 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 26 18:35:43 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 26 18:35:43 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 26 18:35:43 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 26 18:35:43 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 26 18:35:43 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 26 18:35:43 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 26 18:35:43 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 26 18:35:43 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 26 18:35:43 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 26 18:35:43 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 26 18:35:43 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 26 18:35:43 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 26 18:35:43 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 26 18:35:43 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 26 18:35:43 localhost kernel: TSC deadline timer available
Jan 26 18:35:43 localhost kernel: CPU topo: Max. logical packages:   8
Jan 26 18:35:43 localhost kernel: CPU topo: Max. logical dies:       8
Jan 26 18:35:43 localhost kernel: CPU topo: Max. dies per package:   1
Jan 26 18:35:43 localhost kernel: CPU topo: Max. threads per core:   1
Jan 26 18:35:43 localhost kernel: CPU topo: Num. cores per package:     1
Jan 26 18:35:43 localhost kernel: CPU topo: Num. threads per package:   1
Jan 26 18:35:43 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 26 18:35:43 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 26 18:35:43 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 26 18:35:43 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 26 18:35:43 localhost kernel: Booting paravirtualized kernel on KVM
Jan 26 18:35:43 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 26 18:35:43 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 26 18:35:43 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 26 18:35:43 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 26 18:35:43 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 26 18:35:43 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 26 18:35:43 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 18:35:43 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 26 18:35:43 localhost kernel: random: crng init done
Jan 26 18:35:43 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 26 18:35:43 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 26 18:35:43 localhost kernel: Fallback order for Node 0: 0 
Jan 26 18:35:43 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 26 18:35:43 localhost kernel: Policy zone: Normal
Jan 26 18:35:43 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 26 18:35:43 localhost kernel: software IO TLB: area num 8.
Jan 26 18:35:43 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 26 18:35:43 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 26 18:35:43 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 26 18:35:43 localhost kernel: Dynamic Preempt: voluntary
Jan 26 18:35:43 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 26 18:35:43 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 26 18:35:43 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 26 18:35:43 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 26 18:35:43 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 26 18:35:43 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 26 18:35:43 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 26 18:35:43 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 26 18:35:43 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 18:35:43 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 18:35:43 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 26 18:35:43 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 26 18:35:43 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 26 18:35:43 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 26 18:35:43 localhost kernel: Console: colour VGA+ 80x25
Jan 26 18:35:43 localhost kernel: printk: console [ttyS0] enabled
Jan 26 18:35:43 localhost kernel: ACPI: Core revision 20230331
Jan 26 18:35:43 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 26 18:35:43 localhost kernel: x2apic enabled
Jan 26 18:35:43 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 26 18:35:43 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 26 18:35:43 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 26 18:35:43 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 26 18:35:43 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 26 18:35:43 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 26 18:35:43 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 26 18:35:43 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 26 18:35:43 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 26 18:35:43 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 26 18:35:43 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 26 18:35:43 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 26 18:35:43 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 26 18:35:43 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 26 18:35:43 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 26 18:35:43 localhost kernel: x86/bugs: return thunk changed
Jan 26 18:35:43 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 26 18:35:43 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 26 18:35:43 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 26 18:35:43 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 26 18:35:43 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 26 18:35:43 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 26 18:35:43 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 26 18:35:43 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 26 18:35:43 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 26 18:35:43 localhost kernel: landlock: Up and running.
Jan 26 18:35:43 localhost kernel: Yama: becoming mindful.
Jan 26 18:35:43 localhost kernel: SELinux:  Initializing.
Jan 26 18:35:43 localhost kernel: LSM support for eBPF active
Jan 26 18:35:43 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 18:35:43 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 26 18:35:43 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 26 18:35:43 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 26 18:35:43 localhost kernel: ... version:                0
Jan 26 18:35:43 localhost kernel: ... bit width:              48
Jan 26 18:35:43 localhost kernel: ... generic registers:      6
Jan 26 18:35:43 localhost kernel: ... value mask:             0000ffffffffffff
Jan 26 18:35:43 localhost kernel: ... max period:             00007fffffffffff
Jan 26 18:35:43 localhost kernel: ... fixed-purpose events:   0
Jan 26 18:35:43 localhost kernel: ... event mask:             000000000000003f
Jan 26 18:35:43 localhost kernel: signal: max sigframe size: 1776
Jan 26 18:35:43 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 26 18:35:43 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 26 18:35:43 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 26 18:35:43 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 26 18:35:43 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 26 18:35:43 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 26 18:35:43 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 26 18:35:43 localhost kernel: node 0 deferred pages initialised in 9ms
Jan 26 18:35:43 localhost kernel: Memory: 7763796K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 26 18:35:43 localhost kernel: devtmpfs: initialized
Jan 26 18:35:43 localhost kernel: x86/mm: Memory block size: 128MB
Jan 26 18:35:43 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 26 18:35:43 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 26 18:35:43 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 26 18:35:43 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 26 18:35:43 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 26 18:35:43 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 26 18:35:43 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 26 18:35:43 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 26 18:35:43 localhost kernel: audit: type=2000 audit(1769452541.652:1): state=initialized audit_enabled=0 res=1
Jan 26 18:35:43 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 26 18:35:43 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 26 18:35:43 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 26 18:35:43 localhost kernel: cpuidle: using governor menu
Jan 26 18:35:43 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 26 18:35:43 localhost kernel: PCI: Using configuration type 1 for base access
Jan 26 18:35:43 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 26 18:35:43 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 26 18:35:43 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 26 18:35:43 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 26 18:35:43 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 26 18:35:43 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 26 18:35:43 localhost kernel: Demotion targets for Node 0: null
Jan 26 18:35:43 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 26 18:35:43 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 26 18:35:43 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 26 18:35:43 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 26 18:35:43 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 26 18:35:43 localhost kernel: ACPI: Interpreter enabled
Jan 26 18:35:43 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 26 18:35:43 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 26 18:35:43 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 26 18:35:43 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 26 18:35:43 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 26 18:35:43 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 26 18:35:43 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [3] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [4] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [5] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [6] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [7] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [8] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [9] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [10] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [11] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [12] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [13] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [14] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [15] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [16] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [17] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [18] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [19] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [20] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [21] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [22] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [23] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [24] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [25] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [26] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [27] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [28] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [29] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [30] registered
Jan 26 18:35:43 localhost kernel: acpiphp: Slot [31] registered
Jan 26 18:35:43 localhost kernel: PCI host bridge to bus 0000:00
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 26 18:35:43 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 26 18:35:43 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 26 18:35:43 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 26 18:35:43 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 26 18:35:43 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 26 18:35:43 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 26 18:35:43 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 26 18:35:43 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 26 18:35:43 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 26 18:35:43 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 26 18:35:43 localhost kernel: iommu: Default domain type: Translated
Jan 26 18:35:43 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 26 18:35:43 localhost kernel: SCSI subsystem initialized
Jan 26 18:35:43 localhost kernel: ACPI: bus type USB registered
Jan 26 18:35:43 localhost kernel: usbcore: registered new interface driver usbfs
Jan 26 18:35:43 localhost kernel: usbcore: registered new interface driver hub
Jan 26 18:35:43 localhost kernel: usbcore: registered new device driver usb
Jan 26 18:35:43 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 26 18:35:43 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 26 18:35:43 localhost kernel: PTP clock support registered
Jan 26 18:35:43 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 26 18:35:43 localhost kernel: NetLabel: Initializing
Jan 26 18:35:43 localhost kernel: NetLabel:  domain hash size = 128
Jan 26 18:35:43 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 26 18:35:43 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 26 18:35:43 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 26 18:35:43 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 26 18:35:43 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 26 18:35:43 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 26 18:35:43 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 26 18:35:43 localhost kernel: vgaarb: loaded
Jan 26 18:35:43 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 26 18:35:43 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 26 18:35:43 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 26 18:35:43 localhost kernel: pnp: PnP ACPI init
Jan 26 18:35:43 localhost kernel: pnp 00:03: [dma 2]
Jan 26 18:35:43 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 26 18:35:43 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 26 18:35:43 localhost kernel: NET: Registered PF_INET protocol family
Jan 26 18:35:43 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 26 18:35:43 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 26 18:35:43 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 26 18:35:43 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 26 18:35:43 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 26 18:35:43 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 26 18:35:43 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 26 18:35:43 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 18:35:43 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 26 18:35:43 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 26 18:35:43 localhost kernel: NET: Registered PF_XDP protocol family
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 26 18:35:43 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 26 18:35:43 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 26 18:35:43 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 26 18:35:43 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 74778 usecs
Jan 26 18:35:43 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 26 18:35:43 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 26 18:35:43 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 26 18:35:43 localhost kernel: ACPI: bus type thunderbolt registered
Jan 26 18:35:43 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 26 18:35:43 localhost kernel: Initialise system trusted keyrings
Jan 26 18:35:43 localhost kernel: Key type blacklist registered
Jan 26 18:35:43 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 26 18:35:43 localhost kernel: zbud: loaded
Jan 26 18:35:43 localhost kernel: integrity: Platform Keyring initialized
Jan 26 18:35:43 localhost kernel: integrity: Machine keyring initialized
Jan 26 18:35:43 localhost kernel: Freeing initrd memory: 87956K
Jan 26 18:35:43 localhost kernel: NET: Registered PF_ALG protocol family
Jan 26 18:35:43 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 26 18:35:43 localhost kernel: Key type asymmetric registered
Jan 26 18:35:43 localhost kernel: Asymmetric key parser 'x509' registered
Jan 26 18:35:43 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 26 18:35:43 localhost kernel: io scheduler mq-deadline registered
Jan 26 18:35:43 localhost kernel: io scheduler kyber registered
Jan 26 18:35:43 localhost kernel: io scheduler bfq registered
Jan 26 18:35:43 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 26 18:35:43 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 26 18:35:43 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 26 18:35:43 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 26 18:35:43 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 26 18:35:43 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 26 18:35:43 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 26 18:35:43 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 26 18:35:43 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 26 18:35:43 localhost kernel: Non-volatile memory driver v1.3
Jan 26 18:35:43 localhost kernel: rdac: device handler registered
Jan 26 18:35:43 localhost kernel: hp_sw: device handler registered
Jan 26 18:35:43 localhost kernel: emc: device handler registered
Jan 26 18:35:43 localhost kernel: alua: device handler registered
Jan 26 18:35:43 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 26 18:35:43 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 26 18:35:43 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 26 18:35:43 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 26 18:35:43 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 26 18:35:43 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 26 18:35:43 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 26 18:35:43 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 26 18:35:43 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 26 18:35:43 localhost kernel: hub 1-0:1.0: USB hub found
Jan 26 18:35:43 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 26 18:35:43 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 26 18:35:43 localhost kernel: usbserial: USB Serial support registered for generic
Jan 26 18:35:43 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 26 18:35:43 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 26 18:35:43 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 26 18:35:43 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 26 18:35:43 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 26 18:35:43 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 26 18:35:43 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 26 18:35:43 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-26T18:35:42 UTC (1769452542)
Jan 26 18:35:43 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 26 18:35:43 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 26 18:35:43 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 26 18:35:43 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 26 18:35:43 localhost kernel: usbcore: registered new interface driver usbhid
Jan 26 18:35:43 localhost kernel: usbhid: USB HID core driver
Jan 26 18:35:43 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 26 18:35:43 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 26 18:35:43 localhost kernel: Initializing XFRM netlink socket
Jan 26 18:35:43 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 26 18:35:43 localhost kernel: Segment Routing with IPv6
Jan 26 18:35:43 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 26 18:35:43 localhost kernel: mpls_gso: MPLS GSO support
Jan 26 18:35:43 localhost kernel: IPI shorthand broadcast: enabled
Jan 26 18:35:43 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 26 18:35:43 localhost kernel: AES CTR mode by8 optimization enabled
Jan 26 18:35:43 localhost kernel: sched_clock: Marking stable (1832004511, 160918425)->(2120658205, -127735269)
Jan 26 18:35:43 localhost kernel: registered taskstats version 1
Jan 26 18:35:43 localhost kernel: Loading compiled-in X.509 certificates
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 26 18:35:43 localhost kernel: Demotion targets for Node 0: null
Jan 26 18:35:43 localhost kernel: page_owner is disabled
Jan 26 18:35:43 localhost kernel: Key type .fscrypt registered
Jan 26 18:35:43 localhost kernel: Key type fscrypt-provisioning registered
Jan 26 18:35:43 localhost kernel: Key type big_key registered
Jan 26 18:35:43 localhost kernel: Key type encrypted registered
Jan 26 18:35:43 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 26 18:35:43 localhost kernel: Loading compiled-in module X.509 certificates
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 26 18:35:43 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 26 18:35:43 localhost kernel: ima: No architecture policies found
Jan 26 18:35:43 localhost kernel: evm: Initialising EVM extended attributes:
Jan 26 18:35:43 localhost kernel: evm: security.selinux
Jan 26 18:35:43 localhost kernel: evm: security.SMACK64 (disabled)
Jan 26 18:35:43 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 26 18:35:43 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 26 18:35:43 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 26 18:35:43 localhost kernel: evm: security.apparmor (disabled)
Jan 26 18:35:43 localhost kernel: evm: security.ima
Jan 26 18:35:43 localhost kernel: evm: security.capability
Jan 26 18:35:43 localhost kernel: evm: HMAC attrs: 0x1
Jan 26 18:35:43 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 26 18:35:43 localhost kernel: Running certificate verification RSA selftest
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 26 18:35:43 localhost kernel: Running certificate verification ECDSA selftest
Jan 26 18:35:43 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 26 18:35:43 localhost kernel: clk: Disabling unused clocks
Jan 26 18:35:43 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 26 18:35:43 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 26 18:35:43 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 26 18:35:43 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 26 18:35:43 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 26 18:35:43 localhost kernel: Run /init as init process
Jan 26 18:35:43 localhost kernel:   with arguments:
Jan 26 18:35:43 localhost kernel:     /init
Jan 26 18:35:43 localhost kernel:   with environment:
Jan 26 18:35:43 localhost kernel:     HOME=/
Jan 26 18:35:43 localhost kernel:     TERM=linux
Jan 26 18:35:43 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 26 18:35:43 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 18:35:43 localhost systemd[1]: Detected virtualization kvm.
Jan 26 18:35:43 localhost systemd[1]: Detected architecture x86-64.
Jan 26 18:35:43 localhost systemd[1]: Running in initrd.
Jan 26 18:35:43 localhost systemd[1]: No hostname configured, using default hostname.
Jan 26 18:35:43 localhost systemd[1]: Hostname set to <localhost>.
Jan 26 18:35:43 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 26 18:35:43 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 26 18:35:43 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 26 18:35:43 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 26 18:35:43 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 26 18:35:43 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 26 18:35:43 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 26 18:35:43 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 26 18:35:43 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 26 18:35:43 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 18:35:43 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 18:35:43 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 26 18:35:43 localhost systemd[1]: Reached target Local File Systems.
Jan 26 18:35:43 localhost systemd[1]: Reached target Path Units.
Jan 26 18:35:43 localhost systemd[1]: Reached target Slice Units.
Jan 26 18:35:43 localhost systemd[1]: Reached target Swaps.
Jan 26 18:35:43 localhost systemd[1]: Reached target Timer Units.
Jan 26 18:35:43 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 18:35:43 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 26 18:35:43 localhost systemd[1]: Listening on Journal Socket.
Jan 26 18:35:43 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 18:35:43 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 18:35:43 localhost systemd[1]: Reached target Socket Units.
Jan 26 18:35:43 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 18:35:43 localhost systemd[1]: Starting Journal Service...
Jan 26 18:35:43 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 18:35:43 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 18:35:43 localhost systemd[1]: Starting Create System Users...
Jan 26 18:35:43 localhost systemd[1]: Starting Setup Virtual Console...
Jan 26 18:35:43 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 18:35:43 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 18:35:43 localhost systemd[1]: Finished Create System Users.
Jan 26 18:35:43 localhost systemd-journald[304]: Journal started
Jan 26 18:35:43 localhost systemd-journald[304]: Runtime Journal (/run/log/journal/a7a0dc8c044040bb835e0c8b31a79067) is 8.0M, max 153.6M, 145.6M free.
Jan 26 18:35:43 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Jan 26 18:35:43 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Jan 26 18:35:43 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 26 18:35:43 localhost systemd[1]: Started Journal Service.
Jan 26 18:35:43 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 18:35:43 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 18:35:43 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 18:35:43 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 18:35:43 localhost systemd[1]: Finished Setup Virtual Console.
Jan 26 18:35:43 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 26 18:35:43 localhost systemd[1]: Starting dracut cmdline hook...
Jan 26 18:35:43 localhost dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Jan 26 18:35:43 localhost dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 26 18:35:43 localhost systemd[1]: Finished dracut cmdline hook.
Jan 26 18:35:43 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 26 18:35:43 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 26 18:35:43 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 26 18:35:43 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 26 18:35:43 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 26 18:35:43 localhost kernel: RPC: Registered udp transport module.
Jan 26 18:35:43 localhost kernel: RPC: Registered tcp transport module.
Jan 26 18:35:43 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 26 18:35:43 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 26 18:35:43 localhost rpc.statd[441]: Version 2.5.4 starting
Jan 26 18:35:43 localhost rpc.statd[441]: Initializing NSM state
Jan 26 18:35:43 localhost rpc.idmapd[446]: Setting log level to 0
Jan 26 18:35:43 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 26 18:35:43 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 18:35:43 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 18:35:43 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 18:35:43 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 26 18:35:43 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 26 18:35:43 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 18:35:43 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 26 18:35:43 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 18:35:43 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 18:35:43 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 18:35:43 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 18:35:43 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 18:35:43 localhost systemd[1]: Reached target Network.
Jan 26 18:35:43 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 26 18:35:43 localhost systemd[1]: Starting dracut initqueue hook...
Jan 26 18:35:43 localhost kernel: libata version 3.00 loaded.
Jan 26 18:35:43 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 26 18:35:43 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 26 18:35:43 localhost kernel: scsi host0: ata_piix
Jan 26 18:35:43 localhost kernel: scsi host1: ata_piix
Jan 26 18:35:43 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 26 18:35:43 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 26 18:35:44 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 26 18:35:44 localhost systemd-udevd[461]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 18:35:44 localhost kernel:  vda: vda1
Jan 26 18:35:44 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 26 18:35:44 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 26 18:35:44 localhost systemd[1]: Reached target System Initialization.
Jan 26 18:35:44 localhost systemd[1]: Reached target Basic System.
Jan 26 18:35:44 localhost kernel: ata1: found unknown device (class 0)
Jan 26 18:35:44 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 26 18:35:44 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 26 18:35:44 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 26 18:35:44 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 18:35:44 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 26 18:35:44 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 26 18:35:44 localhost systemd[1]: Reached target Initrd Root Device.
Jan 26 18:35:44 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 26 18:35:44 localhost systemd[1]: Finished dracut initqueue hook.
Jan 26 18:35:44 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 18:35:44 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 26 18:35:44 localhost systemd[1]: Reached target Remote File Systems.
Jan 26 18:35:44 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 26 18:35:44 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 26 18:35:44 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 26 18:35:44 localhost systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Jan 26 18:35:44 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 26 18:35:44 localhost systemd[1]: Mounting /sysroot...
Jan 26 18:35:44 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 26 18:35:44 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 26 18:35:44 localhost kernel: XFS (vda1): Ending clean mount
Jan 26 18:35:44 localhost systemd[1]: Mounted /sysroot.
Jan 26 18:35:44 localhost systemd[1]: Reached target Initrd Root File System.
Jan 26 18:35:44 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 26 18:35:45 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 26 18:35:45 localhost systemd[1]: Reached target Initrd File Systems.
Jan 26 18:35:45 localhost systemd[1]: Reached target Initrd Default Target.
Jan 26 18:35:45 localhost systemd[1]: Starting dracut mount hook...
Jan 26 18:35:45 localhost systemd[1]: Finished dracut mount hook.
Jan 26 18:35:45 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 26 18:35:45 localhost rpc.idmapd[446]: exiting on signal 15
Jan 26 18:35:45 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 26 18:35:45 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 26 18:35:45 localhost systemd[1]: Stopped target Network.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Timer Units.
Jan 26 18:35:45 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 26 18:35:45 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Basic System.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Path Units.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Remote File Systems.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Slice Units.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Socket Units.
Jan 26 18:35:45 localhost systemd[1]: Stopped target System Initialization.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Local File Systems.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Swaps.
Jan 26 18:35:45 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut mount hook.
Jan 26 18:35:45 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 26 18:35:45 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 26 18:35:45 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 26 18:35:45 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 26 18:35:45 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 26 18:35:45 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 26 18:35:45 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 26 18:35:45 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 26 18:35:45 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 26 18:35:45 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 26 18:35:45 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 26 18:35:45 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 26 18:35:45 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Closed udev Control Socket.
Jan 26 18:35:45 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Closed udev Kernel Socket.
Jan 26 18:35:45 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 26 18:35:45 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 26 18:35:45 localhost systemd[1]: Starting Cleanup udev Database...
Jan 26 18:35:45 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 26 18:35:45 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 26 18:35:45 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Stopped Create System Users.
Jan 26 18:35:45 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 26 18:35:45 localhost systemd[1]: Finished Cleanup udev Database.
Jan 26 18:35:45 localhost systemd[1]: Reached target Switch Root.
Jan 26 18:35:45 localhost systemd[1]: Starting Switch Root...
Jan 26 18:35:45 localhost systemd[1]: Switching root.
Jan 26 18:35:45 localhost systemd-journald[304]: Journal stopped
Jan 26 18:35:46 localhost systemd-journald[304]: Received SIGTERM from PID 1 (systemd).
Jan 26 18:35:46 localhost kernel: audit: type=1404 audit(1769452545.480:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability open_perms=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 18:35:46 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 18:35:46 localhost kernel: audit: type=1403 audit(1769452545.600:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 26 18:35:46 localhost systemd[1]: Successfully loaded SELinux policy in 123.697ms.
Jan 26 18:35:46 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.352ms.
Jan 26 18:35:46 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 26 18:35:46 localhost systemd[1]: Detected virtualization kvm.
Jan 26 18:35:46 localhost systemd[1]: Detected architecture x86-64.
Jan 26 18:35:46 localhost systemd-rc-local-generator[634]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 18:35:46 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Stopped Switch Root.
Jan 26 18:35:46 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 26 18:35:46 localhost systemd[1]: Created slice Slice /system/getty.
Jan 26 18:35:46 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 26 18:35:46 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 26 18:35:46 localhost systemd[1]: Created slice User and Session Slice.
Jan 26 18:35:46 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 26 18:35:46 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 26 18:35:46 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 26 18:35:46 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 26 18:35:46 localhost systemd[1]: Stopped target Switch Root.
Jan 26 18:35:46 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 26 18:35:46 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 26 18:35:46 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 26 18:35:46 localhost systemd[1]: Reached target Path Units.
Jan 26 18:35:46 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 26 18:35:46 localhost systemd[1]: Reached target Slice Units.
Jan 26 18:35:46 localhost systemd[1]: Reached target Swaps.
Jan 26 18:35:46 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 26 18:35:46 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 26 18:35:46 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 26 18:35:46 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 26 18:35:46 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 26 18:35:46 localhost systemd[1]: Listening on udev Control Socket.
Jan 26 18:35:46 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 26 18:35:46 localhost systemd[1]: Mounting Huge Pages File System...
Jan 26 18:35:46 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 26 18:35:46 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 26 18:35:46 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 26 18:35:46 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 18:35:46 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 26 18:35:46 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 18:35:46 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 26 18:35:46 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 26 18:35:46 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 26 18:35:46 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 26 18:35:46 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 26 18:35:46 localhost systemd[1]: Stopped Journal Service.
Jan 26 18:35:46 localhost systemd[1]: Starting Journal Service...
Jan 26 18:35:46 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 26 18:35:46 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 26 18:35:46 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 18:35:46 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 26 18:35:46 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 26 18:35:46 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 26 18:35:46 localhost kernel: fuse: init (API version 7.37)
Jan 26 18:35:46 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 26 18:35:46 localhost systemd-journald[676]: Journal started
Jan 26 18:35:46 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 18:35:45 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 26 18:35:45 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 26 18:35:46 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 26 18:35:46 localhost systemd[1]: Started Journal Service.
Jan 26 18:35:46 localhost systemd[1]: Mounted Huge Pages File System.
Jan 26 18:35:46 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 26 18:35:46 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 26 18:35:46 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 26 18:35:46 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 26 18:35:46 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 18:35:46 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 26 18:35:46 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 26 18:35:46 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 26 18:35:46 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 26 18:35:46 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 26 18:35:46 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 26 18:35:46 localhost systemd[1]: Mounting FUSE Control File System...
Jan 26 18:35:46 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 18:35:46 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 26 18:35:46 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 26 18:35:46 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 26 18:35:46 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 26 18:35:46 localhost systemd[1]: Starting Create System Users...
Jan 26 18:35:46 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 26 18:35:46 localhost systemd-journald[676]: Received client request to flush runtime journal.
Jan 26 18:35:46 localhost systemd[1]: Mounted FUSE Control File System.
Jan 26 18:35:46 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 26 18:35:46 localhost kernel: ACPI: bus type drm_connector registered
Jan 26 18:35:46 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 26 18:35:46 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 26 18:35:46 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 26 18:35:46 localhost systemd[1]: Finished Create System Users.
Jan 26 18:35:46 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 26 18:35:46 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 26 18:35:46 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 26 18:35:46 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 26 18:35:46 localhost systemd[1]: Reached target Local File Systems.
Jan 26 18:35:46 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 26 18:35:46 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 26 18:35:46 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 26 18:35:46 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 26 18:35:46 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 26 18:35:46 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 26 18:35:46 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 26 18:35:46 localhost bootctl[694]: Couldn't find EFI system partition, skipping.
Jan 26 18:35:46 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 26 18:35:46 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 26 18:35:46 localhost systemd[1]: Starting Security Auditing Service...
Jan 26 18:35:46 localhost systemd[1]: Starting RPC Bind...
Jan 26 18:35:46 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 26 18:35:46 localhost auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 26 18:35:46 localhost auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 26 18:35:46 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 26 18:35:46 localhost systemd[1]: Started RPC Bind.
Jan 26 18:35:46 localhost augenrules[705]: /sbin/augenrules: No change
Jan 26 18:35:46 localhost augenrules[720]: No rules
Jan 26 18:35:46 localhost augenrules[720]: enabled 1
Jan 26 18:35:46 localhost augenrules[720]: failure 1
Jan 26 18:35:46 localhost augenrules[720]: pid 700
Jan 26 18:35:46 localhost augenrules[720]: rate_limit 0
Jan 26 18:35:46 localhost augenrules[720]: backlog_limit 8192
Jan 26 18:35:46 localhost augenrules[720]: lost 0
Jan 26 18:35:46 localhost augenrules[720]: backlog 3
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 18:35:46 localhost augenrules[720]: enabled 1
Jan 26 18:35:46 localhost augenrules[720]: failure 1
Jan 26 18:35:46 localhost augenrules[720]: pid 700
Jan 26 18:35:46 localhost augenrules[720]: rate_limit 0
Jan 26 18:35:46 localhost augenrules[720]: backlog_limit 8192
Jan 26 18:35:46 localhost augenrules[720]: lost 0
Jan 26 18:35:46 localhost augenrules[720]: backlog 0
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 18:35:46 localhost augenrules[720]: enabled 1
Jan 26 18:35:46 localhost augenrules[720]: failure 1
Jan 26 18:35:46 localhost augenrules[720]: pid 700
Jan 26 18:35:46 localhost augenrules[720]: rate_limit 0
Jan 26 18:35:46 localhost augenrules[720]: backlog_limit 8192
Jan 26 18:35:46 localhost augenrules[720]: lost 0
Jan 26 18:35:46 localhost augenrules[720]: backlog 0
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time 60000
Jan 26 18:35:46 localhost augenrules[720]: backlog_wait_time_actual 0
Jan 26 18:35:46 localhost systemd[1]: Started Security Auditing Service.
Jan 26 18:35:46 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 26 18:35:46 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 26 18:35:46 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 26 18:35:46 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 26 18:35:46 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 26 18:35:46 localhost systemd[1]: Starting Update is Completed...
Jan 26 18:35:46 localhost systemd[1]: Finished Update is Completed.
Jan 26 18:35:46 localhost systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Jan 26 18:35:46 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 26 18:35:46 localhost systemd[1]: Reached target System Initialization.
Jan 26 18:35:46 localhost systemd[1]: Started dnf makecache --timer.
Jan 26 18:35:46 localhost systemd[1]: Started Daily rotation of log files.
Jan 26 18:35:46 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 26 18:35:46 localhost systemd[1]: Reached target Timer Units.
Jan 26 18:35:46 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 26 18:35:46 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 26 18:35:46 localhost systemd[1]: Reached target Socket Units.
Jan 26 18:35:46 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 26 18:35:46 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 18:35:46 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 26 18:35:46 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 26 18:35:46 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 26 18:35:46 localhost systemd-udevd[750]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 18:35:46 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 26 18:35:46 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 26 18:35:46 localhost systemd[1]: Reached target Basic System.
Jan 26 18:35:46 localhost dbus-broker-lau[754]: Ready
Jan 26 18:35:46 localhost systemd[1]: Starting NTP client/server...
Jan 26 18:35:46 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 26 18:35:46 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 26 18:35:47 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 26 18:35:47 localhost chronyd[783]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 18:35:47 localhost chronyd[783]: Loaded 0 symmetric keys
Jan 26 18:35:47 localhost chronyd[783]: Using right/UTC timezone to obtain leap second data
Jan 26 18:35:47 localhost chronyd[783]: Loaded seccomp filter (level 2)
Jan 26 18:35:47 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 26 18:35:47 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 26 18:35:47 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 26 18:35:47 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 26 18:35:47 localhost systemd[1]: Started irqbalance daemon.
Jan 26 18:35:47 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 26 18:35:47 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 18:35:47 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 18:35:47 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 18:35:47 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 26 18:35:47 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 26 18:35:47 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 26 18:35:47 localhost systemd[1]: Starting User Login Management...
Jan 26 18:35:47 localhost systemd[1]: Started NTP client/server.
Jan 26 18:35:47 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 26 18:35:47 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 26 18:35:47 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 26 18:35:47 localhost systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 18:35:47 localhost systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 18:35:47 localhost kernel: kvm_amd: TSC scaling supported
Jan 26 18:35:47 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 26 18:35:47 localhost kernel: kvm_amd: Nested Paging enabled
Jan 26 18:35:47 localhost kernel: kvm_amd: LBR virtualization supported
Jan 26 18:35:47 localhost kernel: Console: switching to colour dummy device 80x25
Jan 26 18:35:47 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 26 18:35:47 localhost kernel: [drm] features: -context_init
Jan 26 18:35:47 localhost systemd-logind[794]: New seat seat0.
Jan 26 18:35:47 localhost systemd[1]: Started User Login Management.
Jan 26 18:35:47 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 26 18:35:47 localhost kernel: [drm] number of scanouts: 1
Jan 26 18:35:47 localhost kernel: [drm] number of cap sets: 0
Jan 26 18:35:47 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 26 18:35:47 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 26 18:35:47 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 26 18:35:47 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 26 18:35:47 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 26 18:35:47 localhost iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Jan 26 18:35:47 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 26 18:35:47 localhost cloud-init[836]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 26 Jan 2026 18:35:47 +0000. Up 6.74 seconds.
Jan 26 18:35:47 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 26 18:35:47 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 26 18:35:47 localhost systemd[1]: run-cloud\x2dinit-tmp-tmptzduhqv4.mount: Deactivated successfully.
Jan 26 18:35:47 localhost systemd[1]: Starting Hostname Service...
Jan 26 18:35:47 localhost systemd[1]: Started Hostname Service.
Jan 26 18:35:47 np0005596227.novalocal systemd-hostnamed[850]: Hostname set to <np0005596227.novalocal> (static)
Jan 26 18:35:47 np0005596227.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 26 18:35:47 np0005596227.novalocal systemd[1]: Reached target Preparation for Network.
Jan 26 18:35:47 np0005596227.novalocal systemd[1]: Starting Network Manager...
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0037] NetworkManager (version 1.54.3-2.el9) is starting... (boot:482de948-a14a-4a06-a160-ce1b2a745f1c)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0043] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0123] manager[0x55e3343ad000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0214] hostname: hostname: using hostnamed
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0215] hostname: static hostname changed from (none) to "np0005596227.novalocal"
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0219] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0345] manager[0x55e3343ad000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0345] manager[0x55e3343ad000]: rfkill: WWAN hardware radio set enabled
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0400] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0401] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0402] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0403] manager: Networking is enabled by state file
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0408] settings: Loaded settings plugin: keyfile (internal)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0420] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0448] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0463] dhcp: init: Using DHCP client 'internal'
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0469] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0488] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0500] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0514] device (lo): Activation: starting connection 'lo' (4396d3e2-241a-4088-b481-db553b6a2730)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0527] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0532] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0575] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0581] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0584] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0586] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0588] device (eth0): carrier: link connected
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0592] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0600] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0606] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0612] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0613] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0616] manager: NetworkManager state is now CONNECTING
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0617] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0624] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0628] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0678] dhcp4 (eth0): state changed new lease, address=38.102.83.58
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Started Network Manager.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0691] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Reached target Network.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0717] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0823] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0829] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0831] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0838] device (lo): Activation: successful, device activated.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0844] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0847] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0848] device (eth0): Activation: successful, device activated.
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0855] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 18:35:48 np0005596227.novalocal NetworkManager[854]: <info>  [1769452548.0858] manager: startup complete
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Reached target NFS client services.
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Reached target Remote File Systems.
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 18:35:48 np0005596227.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 26 Jan 2026 18:35:48 +0000. Up 7.70 seconds.
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |  eth0  | True |         38.102.83.58         | 255.255.255.0 | global | fa:16:3e:25:de:67 |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |  eth0  | True | fe80::f816:3eff:fe25:de67/64 |       .       |  link  | fa:16:3e:25:de:67 |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 26 18:35:48 np0005596227.novalocal cloud-init[917]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: new group: name=cloud-user, GID=1001
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: add 'cloud-user' to group 'adm'
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: add 'cloud-user' to group 'systemd-journal'
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: add 'cloud-user' to shadow group 'adm'
Jan 26 18:35:49 np0005596227.novalocal useradd[983]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Generating public/private rsa key pair.
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key fingerprint is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: SHA256:G6lPA0G3qdJmN4ef4eTis1PTDiDaeNmVB2JJ4dC32ys root@np0005596227.novalocal
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key's randomart image is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +---[RSA 3072]----+
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |      oo+o       |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |     . +=oo      |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |      ..+o +     |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |     ..o.o+ .    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |    .+*+So+=     |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |    o+++.X=oo    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |     .. =.=+ .   |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |       +ooE o    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |        ++ .     |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +----[SHA256]-----+
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Generating public/private ecdsa key pair.
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key fingerprint is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: SHA256:ZNYUNHrd1N/UUfZhMOEx4VL72PIp7dBwyswWPSG9vQI root@np0005596227.novalocal
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key's randomart image is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +---[ECDSA 256]---+
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |         .=. O==*|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |         + o+oB.*|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |        = o..=.++|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |       + .  . * *|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |        S  E = B.|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |            = X +|
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |             X * |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |            . =  |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |               . |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +----[SHA256]-----+
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Generating public/private ed25519 key pair.
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key fingerprint is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: SHA256:rgnzGQ0flZdNmDu+YBqcBCcg9E+H8KQmM0GUgmXNm0w root@np0005596227.novalocal
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: The key's randomart image is:
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +--[ED25519 256]--+
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |.=O+o..      o.  |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |o..+E=o..  .o+   |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: | .+o+o++. o o..  |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |   =+o ... .o    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |      ooS. . .   |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |       =+.o .    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |    o . ++ . .   |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |     + =.   .    |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: |      =          |
Jan 26 18:35:49 np0005596227.novalocal cloud-init[917]: +----[SHA256]-----+
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Reached target Network is Online.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting System Logging Service...
Jan 26 18:35:49 np0005596227.novalocal sm-notify[1000]: Version 2.5.4 starting
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting Permit User Sessions...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 26 18:35:49 np0005596227.novalocal sshd[1002]: Server listening on 0.0.0.0 port 22.
Jan 26 18:35:49 np0005596227.novalocal sshd[1002]: Server listening on :: port 22.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Finished Permit User Sessions.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started Command Scheduler.
Jan 26 18:35:49 np0005596227.novalocal rsyslogd[1001]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1001" x-info="https://www.rsyslog.com"] start
Jan 26 18:35:49 np0005596227.novalocal rsyslogd[1001]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started Getty on tty1.
Jan 26 18:35:49 np0005596227.novalocal crond[1005]: (CRON) STARTUP (1.5.7)
Jan 26 18:35:49 np0005596227.novalocal crond[1005]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 26 18:35:49 np0005596227.novalocal crond[1005]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 71% if used.)
Jan 26 18:35:49 np0005596227.novalocal crond[1005]: (CRON) INFO (running with inotify support)
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Reached target Login Prompts.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Started System Logging Service.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Reached target Multi-User System.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 26 18:35:49 np0005596227.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 26 18:35:49 np0005596227.novalocal rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 18:35:49 np0005596227.novalocal kdumpctl[1011]: kdump: No kdump initial ramdisk found.
Jan 26 18:35:49 np0005596227.novalocal kdumpctl[1011]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1123]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 26 Jan 2026 18:35:50 +0000. Up 9.29 seconds.
Jan 26 18:35:50 np0005596227.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 26 18:35:50 np0005596227.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 26 18:35:50 np0005596227.novalocal dracut[1263]: dracut-057-102.git20250818.el9
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1280]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 26 Jan 2026 18:35:50 +0000. Up 9.68 seconds.
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1282]: #############################################################
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1283]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1285]: 256 SHA256:ZNYUNHrd1N/UUfZhMOEx4VL72PIp7dBwyswWPSG9vQI root@np0005596227.novalocal (ECDSA)
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1287]: 256 SHA256:rgnzGQ0flZdNmDu+YBqcBCcg9E+H8KQmM0GUgmXNm0w root@np0005596227.novalocal (ED25519)
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1289]: 3072 SHA256:G6lPA0G3qdJmN4ef4eTis1PTDiDaeNmVB2JJ4dC32ys root@np0005596227.novalocal (RSA)
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1292]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1293]: #############################################################
Jan 26 18:35:50 np0005596227.novalocal dracut[1266]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 26 18:35:50 np0005596227.novalocal cloud-init[1280]: Cloud-init v. 24.4-8.el9 finished at Mon, 26 Jan 2026 18:35:50 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.89 seconds
Jan 26 18:35:50 np0005596227.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 26 18:35:50 np0005596227.novalocal systemd[1]: Reached target Cloud-init target.
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1347]: Connection reset by 38.102.83.114 port 39052 [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1352]: Unable to negotiate with 38.102.83.114 port 39064: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1356]: Connection reset by 38.102.83.114 port 39072 [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1360]: Unable to negotiate with 38.102.83.114 port 39076: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1367]: Unable to negotiate with 38.102.83.114 port 39086: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1372]: Connection reset by 38.102.83.114 port 39088 [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1377]: Connection reset by 38.102.83.114 port 39096 [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1394]: Unable to negotiate with 38.102.83.114 port 39106: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 26 18:35:50 np0005596227.novalocal sshd-session[1396]: Unable to negotiate with 38.102.83.114 port 39108: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: memstrack is not available
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: memstrack is not available
Jan 26 18:35:51 np0005596227.novalocal dracut[1266]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 26 18:35:52 np0005596227.novalocal dracut[1266]: *** Including module: systemd ***
Jan 26 18:35:52 np0005596227.novalocal dracut[1266]: *** Including module: fips ***
Jan 26 18:35:52 np0005596227.novalocal dracut[1266]: *** Including module: systemd-initrd ***
Jan 26 18:35:52 np0005596227.novalocal dracut[1266]: *** Including module: i18n ***
Jan 26 18:35:52 np0005596227.novalocal dracut[1266]: *** Including module: drm ***
Jan 26 18:35:52 np0005596227.novalocal chronyd[783]: Selected source 206.108.0.132 (2.centos.pool.ntp.org)
Jan 26 18:35:52 np0005596227.novalocal chronyd[783]: System clock TAI offset set to 37 seconds
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]: *** Including module: prefixdevname ***
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]: *** Including module: kernel-modules ***
Jan 26 18:35:53 np0005596227.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]: *** Including module: kernel-modules-extra ***
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 26 18:35:53 np0005596227.novalocal dracut[1266]: *** Including module: qemu ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: fstab-sys ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: rootfs-block ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: terminfo ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: udev-rules ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: Skipping udev rule: 91-permissions.rules
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: virtiofs ***
Jan 26 18:35:54 np0005596227.novalocal dracut[1266]: *** Including module: dracut-systemd ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]: *** Including module: usrmount ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]: *** Including module: base ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]: *** Including module: fs-lib ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]: *** Including module: kdumpbase ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:   microcode_ctl module: mangling fw_dir
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 26 18:35:55 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]: *** Including module: openssl ***
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]: *** Including module: shutdown ***
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]: *** Including module: squash ***
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]: *** Including modules done ***
Jan 26 18:35:56 np0005596227.novalocal dracut[1266]: *** Installing kernel module dependencies ***
Jan 26 18:35:57 np0005596227.novalocal dracut[1266]: *** Installing kernel module dependencies done ***
Jan 26 18:35:57 np0005596227.novalocal dracut[1266]: *** Resolving executable dependencies ***
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 35 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 33 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 31 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 28 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 34 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 32 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 30 affinity is now unmanaged
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 26 18:35:57 np0005596227.novalocal irqbalance[790]: IRQ 29 affinity is now unmanaged
Jan 26 18:35:58 np0005596227.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 18:35:59 np0005596227.novalocal dracut[1266]: *** Resolving executable dependencies done ***
Jan 26 18:35:59 np0005596227.novalocal dracut[1266]: *** Generating early-microcode cpio image ***
Jan 26 18:35:59 np0005596227.novalocal dracut[1266]: *** Store current command line parameters ***
Jan 26 18:35:59 np0005596227.novalocal dracut[1266]: Stored kernel commandline:
Jan 26 18:35:59 np0005596227.novalocal dracut[1266]: No dracut internal kernel commandline stored in the initramfs
Jan 26 18:36:18 np0005596227.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 18:37:02 np0005596227.novalocal dracut[1266]: *** Install squash loader ***
Jan 26 18:37:03 np0005596227.novalocal dracut[1266]: *** Squashing the files inside the initramfs ***
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: *** Squashing the files inside the initramfs done ***
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: *** Hardlinking files ***
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Mode:           real
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Files:          50
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Linked:         0 files
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Compared:       0 xattrs
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Compared:       0 files
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Saved:          0 B
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: Duration:       0.000568 seconds
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: *** Hardlinking files done ***
Jan 26 18:37:04 np0005596227.novalocal dracut[1266]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 26 18:37:05 np0005596227.novalocal kdumpctl[1011]: kdump: kexec: loaded kdump kernel
Jan 26 18:37:05 np0005596227.novalocal kdumpctl[1011]: kdump: Starting kdump: [OK]
Jan 26 18:37:05 np0005596227.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 26 18:37:05 np0005596227.novalocal systemd[1]: Startup finished in 2.167s (kernel) + 2.599s (initrd) + 1min 20.131s (userspace) = 1min 24.899s.
Jan 26 18:37:43 np0005596227.novalocal sshd-session[4303]: Accepted publickey for zuul from 38.102.83.114 port 49076 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 26 18:37:43 np0005596227.novalocal systemd-logind[794]: New session 1 of user zuul.
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Queued start job for default target Main User Target.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Created slice User Application Slice.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Reached target Paths.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Reached target Timers.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Starting D-Bus User Message Bus Socket...
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Starting Create User's Volatile Files and Directories...
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Listening on D-Bus User Message Bus Socket.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Reached target Sockets.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Finished Create User's Volatile Files and Directories.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Reached target Basic System.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Reached target Main User Target.
Jan 26 18:37:43 np0005596227.novalocal systemd[4307]: Startup finished in 138ms.
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 26 18:37:43 np0005596227.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 26 18:37:43 np0005596227.novalocal sshd-session[4303]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:37:44 np0005596227.novalocal python3[4389]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 18:37:47 np0005596227.novalocal python3[4417]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 18:37:55 np0005596227.novalocal python3[4475]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 18:37:55 np0005596227.novalocal python3[4515]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 26 18:37:57 np0005596227.novalocal python3[4541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvX4sxj6pTlhA3V2DLrKfVZVX3fhOD27Kfndz9X9FxGIZF7OMGIufv+ctO0+cIuLo78yIvAsHkYw6Kff0/FvwcXnvA2seItausAZopMm8NPykywOc5VgCgOhufpK0VAALgtOwKqczPoSs4fQNN1Zty9WRdhLk5Ilch/RVlluuoIK5vHgoH52SO11O8FKiQw0XZ89X7Fm2qnYlktqPkbHGufP02H0xUq4xh+sq5xrxyAluu55KbbxXYra/ZR1mT/UDE5jUA32qPAVHdXl3CfwbgzlzbBTkhRVyMacu+GXAccPmOEdkcdJBZHULUBnlZdUaEJiPb3DOX8xpCdbB/bG5Vaj1BsBS4ACXb+mnk7guWyJWK6SsiEBLeJPGeM/V78/4TJVuSOVtCtw6AFc3mP071+sg9yAYWt8zaVx5MDNZrH9RfBkFATNOM4cdLBfXQqrS5UT5ip61uymSEti+agQ6zNneMSK+22aindiZI42+yvhtscbTH6dOhPUYZ55HOaqs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:37:58 np0005596227.novalocal python3[4565]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:37:58 np0005596227.novalocal python3[4664]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:37:59 np0005596227.novalocal python3[4735]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769452678.477532-229-214786131946052/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=e9d33dc79f1c4490bd4533560605c9cd_id_rsa follow=False checksum=ae6f633871aae7210c2e1ffd22ad641d3240fd45 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:37:59 np0005596227.novalocal python3[4858]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:00 np0005596227.novalocal python3[4929]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769452679.4296324-273-148008723237476/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=e9d33dc79f1c4490bd4533560605c9cd_id_rsa.pub follow=False checksum=ee0af8a09e5e4efd83d4f96c64852a89fa2d19f8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:01 np0005596227.novalocal python3[4977]: ansible-ping Invoked with data=pong
Jan 26 18:38:02 np0005596227.novalocal python3[5001]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 18:38:04 np0005596227.novalocal python3[5059]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 26 18:38:05 np0005596227.novalocal python3[5091]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:05 np0005596227.novalocal python3[5115]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:05 np0005596227.novalocal python3[5139]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:05 np0005596227.novalocal python3[5163]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:06 np0005596227.novalocal python3[5187]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:06 np0005596227.novalocal python3[5211]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:07 np0005596227.novalocal sudo[5235]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-korddjlfkjdviobrrcczuqtwybelcdzj ; /usr/bin/python3'
Jan 26 18:38:07 np0005596227.novalocal sudo[5235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:08 np0005596227.novalocal python3[5237]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:08 np0005596227.novalocal sudo[5235]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:08 np0005596227.novalocal sudo[5313]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iasnakacrhmlplwyeekznfsfnvxujkzk ; /usr/bin/python3'
Jan 26 18:38:08 np0005596227.novalocal sudo[5313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:08 np0005596227.novalocal python3[5315]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:08 np0005596227.novalocal sudo[5313]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:08 np0005596227.novalocal sudo[5386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbbwmeogjgnjnnhrpmbhxuerymvjboko ; /usr/bin/python3'
Jan 26 18:38:09 np0005596227.novalocal sudo[5386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:09 np0005596227.novalocal python3[5388]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769452688.1856256-26-14745422042254/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:09 np0005596227.novalocal sudo[5386]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:09 np0005596227.novalocal python3[5436]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:10 np0005596227.novalocal python3[5460]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:10 np0005596227.novalocal python3[5484]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:10 np0005596227.novalocal python3[5508]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:10 np0005596227.novalocal python3[5532]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:11 np0005596227.novalocal python3[5556]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:11 np0005596227.novalocal python3[5580]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:11 np0005596227.novalocal python3[5604]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:12 np0005596227.novalocal python3[5628]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:12 np0005596227.novalocal python3[5652]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:12 np0005596227.novalocal python3[5676]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:12 np0005596227.novalocal python3[5700]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:13 np0005596227.novalocal python3[5724]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:13 np0005596227.novalocal python3[5748]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:13 np0005596227.novalocal python3[5772]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:14 np0005596227.novalocal python3[5796]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:14 np0005596227.novalocal python3[5820]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:14 np0005596227.novalocal python3[5844]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:14 np0005596227.novalocal python3[5868]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:15 np0005596227.novalocal python3[5892]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:15 np0005596227.novalocal python3[5916]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:15 np0005596227.novalocal python3[5940]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:15 np0005596227.novalocal python3[5964]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:16 np0005596227.novalocal python3[5988]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:16 np0005596227.novalocal python3[6012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:16 np0005596227.novalocal python3[6036]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:38:19 np0005596227.novalocal sudo[6060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfnwrnkokexgmiymambyqvjgrwpnhsxq ; /usr/bin/python3'
Jan 26 18:38:19 np0005596227.novalocal sudo[6060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:19 np0005596227.novalocal python3[6062]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 18:38:19 np0005596227.novalocal systemd[1]: Starting Time & Date Service...
Jan 26 18:38:19 np0005596227.novalocal systemd[1]: Started Time & Date Service.
Jan 26 18:38:19 np0005596227.novalocal systemd-timedated[6064]: Changed time zone to 'UTC' (UTC).
Jan 26 18:38:19 np0005596227.novalocal sudo[6060]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:20 np0005596227.novalocal sudo[6091]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pougfpsombsjgfhlxqesncukdlahrzhs ; /usr/bin/python3'
Jan 26 18:38:20 np0005596227.novalocal sudo[6091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:20 np0005596227.novalocal python3[6093]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:20 np0005596227.novalocal sudo[6091]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:20 np0005596227.novalocal python3[6169]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:21 np0005596227.novalocal python3[6240]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769452700.715813-202-200720107894482/source _original_basename=tmpayys19qt follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:21 np0005596227.novalocal python3[6340]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:22 np0005596227.novalocal python3[6411]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769452701.5729356-242-93858118209318/source _original_basename=tmpjbu_lpmy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:22 np0005596227.novalocal sudo[6511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rehqapnannewggjyonsltbihvbjckdeu ; /usr/bin/python3'
Jan 26 18:38:22 np0005596227.novalocal sudo[6511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:22 np0005596227.novalocal python3[6513]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:22 np0005596227.novalocal sudo[6511]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:23 np0005596227.novalocal sudo[6584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxfkpbwaoajgvlhswkxmhdwxjbhpezpf ; /usr/bin/python3'
Jan 26 18:38:23 np0005596227.novalocal sudo[6584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:23 np0005596227.novalocal python3[6586]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769452702.7095315-306-46421593963476/source _original_basename=tmpe_98wpym follow=False checksum=57ca54ae8e8c8fa7c48adc35da803bb72ed5faaa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:23 np0005596227.novalocal sudo[6584]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:23 np0005596227.novalocal python3[6634]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:38:24 np0005596227.novalocal python3[6660]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:38:24 np0005596227.novalocal sudo[6738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btdqqajzsjltswukuuvbjoxkxhlajowm ; /usr/bin/python3'
Jan 26 18:38:24 np0005596227.novalocal sudo[6738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:24 np0005596227.novalocal python3[6740]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:38:24 np0005596227.novalocal sudo[6738]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:24 np0005596227.novalocal sudo[6811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqqrljlellnfieqmzshfvgwfyuvznjfr ; /usr/bin/python3'
Jan 26 18:38:24 np0005596227.novalocal sudo[6811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:24 np0005596227.novalocal python3[6813]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769452704.3313327-362-127045684490670/source _original_basename=tmp296v40yb follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:24 np0005596227.novalocal sudo[6811]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:25 np0005596227.novalocal sudo[6862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvmrtxkxskjsfkobhopozmaiqjdylifj ; /usr/bin/python3'
Jan 26 18:38:25 np0005596227.novalocal sudo[6862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:25 np0005596227.novalocal python3[6864]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-8b62-52e1-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:38:25 np0005596227.novalocal sudo[6862]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:26 np0005596227.novalocal python3[6892]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-8b62-52e1-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 26 18:38:27 np0005596227.novalocal python3[6920]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:46 np0005596227.novalocal sudo[6944]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytljhplvwcmrktkjjnufjibndgpbwoee ; /usr/bin/python3'
Jan 26 18:38:46 np0005596227.novalocal sudo[6944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:38:46 np0005596227.novalocal python3[6946]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:38:46 np0005596227.novalocal sudo[6944]: pam_unix(sudo:session): session closed for user root
Jan 26 18:38:49 np0005596227.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 26 18:39:26 np0005596227.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 26 18:39:26 np0005596227.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2525] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 18:39:26 np0005596227.novalocal systemd-udevd[6949]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2813] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2858] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2864] device (eth1): carrier: link connected
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2867] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2877] policy: auto-activating connection 'Wired connection 1' (66289fac-35b9-3051-8520-6ca87ed17b69)
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2885] device (eth1): Activation: starting connection 'Wired connection 1' (66289fac-35b9-3051-8520-6ca87ed17b69)
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2887] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2895] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2903] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 18:39:26 np0005596227.novalocal NetworkManager[854]: <info>  [1769452766.2913] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:39:27 np0005596227.novalocal python3[6976]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-58d3-b2c7-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:39:34 np0005596227.novalocal sudo[7054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kajxrndpkcqjdlapnihfkvvfildbnluy ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 18:39:34 np0005596227.novalocal sudo[7054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:39:34 np0005596227.novalocal python3[7056]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:39:34 np0005596227.novalocal sudo[7054]: pam_unix(sudo:session): session closed for user root
Jan 26 18:39:34 np0005596227.novalocal sudo[7127]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcwshafdmvxvubsfzinjqpipqtctszjy ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 18:39:34 np0005596227.novalocal sudo[7127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:39:34 np0005596227.novalocal python3[7129]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769452773.9329839-103-106091529011270/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=dd9605a950da53e3e539de4b9009fc6fcfd6dc5a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:39:34 np0005596227.novalocal sudo[7127]: pam_unix(sudo:session): session closed for user root
Jan 26 18:39:35 np0005596227.novalocal sudo[7177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgrljhytijivazriyzdbmdjiipqttbgg ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 18:39:35 np0005596227.novalocal sudo[7177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:39:35 np0005596227.novalocal python3[7179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Stopping Network Manager...
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5437] caught SIGTERM, shutting down normally.
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5450] dhcp4 (eth0): canceled DHCP transaction
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5450] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5450] dhcp4 (eth0): state changed no lease
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5454] manager: NetworkManager state is now CONNECTING
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5642] dhcp4 (eth1): canceled DHCP transaction
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5642] dhcp4 (eth1): state changed no lease
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[854]: <info>  [1769452775.5710] exiting (success)
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Stopped Network Manager.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: NetworkManager.service: Consumed 1.756s CPU time, 10.1M memory peak.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Starting Network Manager...
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.6451] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:482de948-a14a-4a06-a160-ce1b2a745f1c)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.6454] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.6517] manager[0x559128c4f000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Starting Hostname Service...
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Started Hostname Service.
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7348] hostname: hostname: using hostnamed
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7351] hostname: static hostname changed from (none) to "np0005596227.novalocal"
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7355] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7359] manager[0x559128c4f000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7359] manager[0x559128c4f000]: rfkill: WWAN hardware radio set enabled
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7382] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7382] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7383] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7383] manager: Networking is enabled by state file
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7385] settings: Loaded settings plugin: keyfile (internal)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7388] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7408] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7417] dhcp: init: Using DHCP client 'internal'
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7418] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7423] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7427] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7432] device (lo): Activation: starting connection 'lo' (4396d3e2-241a-4088-b481-db553b6a2730)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7437] device (eth0): carrier: link connected
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7440] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7443] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7444] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7448] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7452] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7456] device (eth1): carrier: link connected
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7460] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7463] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (66289fac-35b9-3051-8520-6ca87ed17b69) (indicated)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7464] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7468] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7473] device (eth1): Activation: starting connection 'Wired connection 1' (66289fac-35b9-3051-8520-6ca87ed17b69)
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Started Network Manager.
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7478] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7482] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7484] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7485] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7487] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7489] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7491] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7493] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7495] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7500] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7503] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7510] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7512] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7525] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7529] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7533] device (lo): Activation: successful, device activated.
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7542] dhcp4 (eth0): state changed new lease, address=38.102.83.58
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.7546] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 18:39:35 np0005596227.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 26 18:39:35 np0005596227.novalocal sudo[7177]: pam_unix(sudo:session): session closed for user root
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8524] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8554] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8556] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8561] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8563] device (eth0): Activation: successful, device activated.
Jan 26 18:39:35 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452775.8568] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 18:39:36 np0005596227.novalocal python3[7263]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-58d3-b2c7-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:39:45 np0005596227.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 18:40:05 np0005596227.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 18:40:14 np0005596227.novalocal systemd[4307]: Starting Mark boot as successful...
Jan 26 18:40:14 np0005596227.novalocal systemd[4307]: Finished Mark boot as successful.
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7109] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 18:40:20 np0005596227.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 18:40:20 np0005596227.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7532] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7535] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7545] device (eth1): Activation: successful, device activated.
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7553] manager: startup complete
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7556] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <warn>  [1769452820.7563] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7573] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7681] dhcp4 (eth1): canceled DHCP transaction
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7682] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7682] dhcp4 (eth1): state changed no lease
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7695] policy: auto-activating connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8)
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7700] device (eth1): Activation: starting connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8)
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7701] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7703] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7710] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7719] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7763] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7764] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 18:40:20 np0005596227.novalocal NetworkManager[7190]: <info>  [1769452820.7769] device (eth1): Activation: successful, device activated.
Jan 26 18:40:30 np0005596227.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 18:40:36 np0005596227.novalocal sshd-session[4316]: Received disconnect from 38.102.83.114 port 49076:11: disconnected by user
Jan 26 18:40:36 np0005596227.novalocal sshd-session[4316]: Disconnected from user zuul 38.102.83.114 port 49076
Jan 26 18:40:36 np0005596227.novalocal sshd-session[4303]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:40:36 np0005596227.novalocal systemd-logind[794]: Session 1 logged out. Waiting for processes to exit.
Jan 26 18:40:58 np0005596227.novalocal sshd-session[7292]: Accepted publickey for zuul from 38.102.83.114 port 43874 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 18:40:58 np0005596227.novalocal systemd-logind[794]: New session 3 of user zuul.
Jan 26 18:40:58 np0005596227.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 26 18:40:58 np0005596227.novalocal sshd-session[7292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:40:58 np0005596227.novalocal sudo[7371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcijpnvhupzhnqwvmkheithjdqjwajby ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 18:40:58 np0005596227.novalocal sudo[7371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:40:58 np0005596227.novalocal python3[7373]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:40:58 np0005596227.novalocal sudo[7371]: pam_unix(sudo:session): session closed for user root
Jan 26 18:40:58 np0005596227.novalocal sudo[7444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvjxjmsjagzoarvdwnpmqqmhkaqfgyv ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 26 18:40:58 np0005596227.novalocal sudo[7444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:40:59 np0005596227.novalocal python3[7446]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769452858.404649-312-180756474691475/source _original_basename=tmp38j4faxz follow=False checksum=1540cf424aeb0b91fb9ff0c8f481b0276f419bce backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:40:59 np0005596227.novalocal sudo[7444]: pam_unix(sudo:session): session closed for user root
Jan 26 18:41:02 np0005596227.novalocal sshd-session[7295]: Connection closed by 38.102.83.114 port 43874
Jan 26 18:41:02 np0005596227.novalocal sshd-session[7292]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:41:02 np0005596227.novalocal systemd-logind[794]: Session 3 logged out. Waiting for processes to exit.
Jan 26 18:41:02 np0005596227.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 26 18:41:02 np0005596227.novalocal systemd-logind[794]: Removed session 3.
Jan 26 18:43:14 np0005596227.novalocal systemd[4307]: Created slice User Background Tasks Slice.
Jan 26 18:43:14 np0005596227.novalocal systemd[4307]: Starting Cleanup of User's Temporary Files and Directories...
Jan 26 18:43:14 np0005596227.novalocal systemd[4307]: Finished Cleanup of User's Temporary Files and Directories.
Jan 26 18:47:47 np0005596227.novalocal sshd-session[7478]: Accepted publickey for zuul from 38.102.83.114 port 48430 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 18:47:47 np0005596227.novalocal systemd-logind[794]: New session 4 of user zuul.
Jan 26 18:47:47 np0005596227.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 26 18:47:47 np0005596227.novalocal sshd-session[7478]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:47:47 np0005596227.novalocal sudo[7505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mocdcjycdojswkzmsfiwdbhnztolqjoq ; /usr/bin/python3'
Jan 26 18:47:47 np0005596227.novalocal sudo[7505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:47 np0005596227.novalocal python3[7507]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b71c-b447-000000002172-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:47 np0005596227.novalocal sudo[7505]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:47 np0005596227.novalocal sudo[7534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geaamrigqzdtjsbuclzazrutnwnmuwub ; /usr/bin/python3'
Jan 26 18:47:47 np0005596227.novalocal sudo[7534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:47 np0005596227.novalocal python3[7536]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:47 np0005596227.novalocal sudo[7534]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:47 np0005596227.novalocal sudo[7560]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjqrkalqvvutommzmlhinpsaypnaowsr ; /usr/bin/python3'
Jan 26 18:47:47 np0005596227.novalocal sudo[7560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:47 np0005596227.novalocal python3[7562]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:47 np0005596227.novalocal sudo[7560]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:48 np0005596227.novalocal sudo[7586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyvkikingnatazxjbmnhpdleqyqiaow ; /usr/bin/python3'
Jan 26 18:47:48 np0005596227.novalocal sudo[7586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:48 np0005596227.novalocal python3[7588]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:48 np0005596227.novalocal sudo[7586]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:48 np0005596227.novalocal sudo[7612]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyuhfavmxostlreejmrwusutfssqxlvn ; /usr/bin/python3'
Jan 26 18:47:48 np0005596227.novalocal sudo[7612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:48 np0005596227.novalocal python3[7614]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:48 np0005596227.novalocal sudo[7612]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:48 np0005596227.novalocal sudo[7638]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmjhaayqydiygueyqgyzxufcayuhdrjq ; /usr/bin/python3'
Jan 26 18:47:48 np0005596227.novalocal sudo[7638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:49 np0005596227.novalocal python3[7640]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:49 np0005596227.novalocal sudo[7638]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:49 np0005596227.novalocal sudo[7716]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frsmyxrfhfuqnfgogmdgihzghzgxcvtq ; /usr/bin/python3'
Jan 26 18:47:49 np0005596227.novalocal sudo[7716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:49 np0005596227.novalocal python3[7718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:47:49 np0005596227.novalocal sudo[7716]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:49 np0005596227.novalocal sudo[7789]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpbpstfpplenpkxdtuwhxjudnjfjdsf ; /usr/bin/python3'
Jan 26 18:47:49 np0005596227.novalocal sudo[7789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:50 np0005596227.novalocal python3[7791]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769453269.367405-516-21962397915080/source _original_basename=tmpful_y9ho follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:47:50 np0005596227.novalocal sudo[7789]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:50 np0005596227.novalocal sudo[7839]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prucfpjtynmqfinklhqpcoxwkucsgrqe ; /usr/bin/python3'
Jan 26 18:47:50 np0005596227.novalocal sudo[7839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:50 np0005596227.novalocal python3[7841]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 18:47:50 np0005596227.novalocal systemd[1]: Reloading.
Jan 26 18:47:51 np0005596227.novalocal systemd-rc-local-generator[7859]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 18:47:51 np0005596227.novalocal sudo[7839]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:52 np0005596227.novalocal sudo[7895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmukyezyxnuyjgqljghpcsvclrmfghnx ; /usr/bin/python3'
Jan 26 18:47:52 np0005596227.novalocal sudo[7895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:52 np0005596227.novalocal python3[7897]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 26 18:47:52 np0005596227.novalocal sudo[7895]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:52 np0005596227.novalocal sudo[7921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwnnbdssheoipdhpxnihfwujdmmrpcau ; /usr/bin/python3'
Jan 26 18:47:52 np0005596227.novalocal sudo[7921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:53 np0005596227.novalocal python3[7923]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:53 np0005596227.novalocal sudo[7921]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:53 np0005596227.novalocal sudo[7949]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxujcqtdnmoezsuqofbvhcfobehoeml ; /usr/bin/python3'
Jan 26 18:47:53 np0005596227.novalocal sudo[7949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:53 np0005596227.novalocal python3[7951]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:53 np0005596227.novalocal sudo[7949]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:53 np0005596227.novalocal sudo[7977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfsvaytgchuxerrursvqpqrdmszljewc ; /usr/bin/python3'
Jan 26 18:47:53 np0005596227.novalocal sudo[7977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:53 np0005596227.novalocal python3[7979]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:53 np0005596227.novalocal sudo[7977]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:53 np0005596227.novalocal sudo[8005]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthpsuxxptlqijbgzmlelgeefgoprmie ; /usr/bin/python3'
Jan 26 18:47:53 np0005596227.novalocal sudo[8005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:53 np0005596227.novalocal python3[8007]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:53 np0005596227.novalocal sudo[8005]: pam_unix(sudo:session): session closed for user root
Jan 26 18:47:54 np0005596227.novalocal python3[8034]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b71c-b447-000000002179-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:47:55 np0005596227.novalocal python3[8064]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 26 18:47:57 np0005596227.novalocal sshd-session[7481]: Connection closed by 38.102.83.114 port 48430
Jan 26 18:47:57 np0005596227.novalocal sshd-session[7478]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:47:57 np0005596227.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 26 18:47:57 np0005596227.novalocal systemd[1]: session-4.scope: Consumed 4.290s CPU time.
Jan 26 18:47:57 np0005596227.novalocal systemd-logind[794]: Session 4 logged out. Waiting for processes to exit.
Jan 26 18:47:57 np0005596227.novalocal systemd-logind[794]: Removed session 4.
Jan 26 18:47:59 np0005596227.novalocal sshd-session[8070]: Accepted publickey for zuul from 38.102.83.114 port 54390 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 18:47:59 np0005596227.novalocal systemd-logind[794]: New session 5 of user zuul.
Jan 26 18:47:59 np0005596227.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 26 18:47:59 np0005596227.novalocal sshd-session[8070]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:47:59 np0005596227.novalocal sudo[8097]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ondtshkfvmtqhiizayjlakftxaccrijz ; /usr/bin/python3'
Jan 26 18:47:59 np0005596227.novalocal sudo[8097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:47:59 np0005596227.novalocal python3[8099]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 26 18:48:04 np0005596227.novalocal setsebool[8141]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 26 18:48:04 np0005596227.novalocal setsebool[8141]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 18:48:17 np0005596227.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 18:48:29 np0005596227.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 18:48:47 np0005596227.novalocal dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 18:48:47 np0005596227.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 18:48:48 np0005596227.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 26 18:48:48 np0005596227.novalocal systemd[1]: Reloading.
Jan 26 18:48:48 np0005596227.novalocal systemd-rc-local-generator[8908]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 18:48:48 np0005596227.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 18:48:49 np0005596227.novalocal sudo[8097]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:00 np0005596227.novalocal python3[15984]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-d00e-a4f8-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:49:01 np0005596227.novalocal kernel: evm: overlay not supported
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: Starting D-Bus User Message Bus...
Jan 26 18:49:01 np0005596227.novalocal dbus-broker-launch[16475]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 26 18:49:01 np0005596227.novalocal dbus-broker-launch[16475]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: Started D-Bus User Message Bus.
Jan 26 18:49:01 np0005596227.novalocal dbus-broker-lau[16475]: Ready
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: Created slice Slice /user.
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: podman-16402.scope: unit configures an IP firewall, but not running as root.
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: (This warning is only shown for the first unit using IP firewalling.)
Jan 26 18:49:01 np0005596227.novalocal systemd[4307]: Started podman-16402.scope.
Jan 26 18:49:02 np0005596227.novalocal systemd[4307]: Started podman-pause-3ad3df3e.scope.
Jan 26 18:49:02 np0005596227.novalocal sudo[16852]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipgwbrojqbqhdmvqpgomahdjrnmqpdyu ; /usr/bin/python3'
Jan 26 18:49:02 np0005596227.novalocal sudo[16852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:02 np0005596227.novalocal python3[16865]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.223:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.223:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:49:02 np0005596227.novalocal python3[16865]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 26 18:49:02 np0005596227.novalocal sudo[16852]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:03 np0005596227.novalocal sshd-session[8073]: Connection closed by 38.102.83.114 port 54390
Jan 26 18:49:03 np0005596227.novalocal sshd-session[8070]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:49:03 np0005596227.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 26 18:49:03 np0005596227.novalocal systemd[1]: session-5.scope: Consumed 45.869s CPU time.
Jan 26 18:49:03 np0005596227.novalocal systemd-logind[794]: Session 5 logged out. Waiting for processes to exit.
Jan 26 18:49:03 np0005596227.novalocal systemd-logind[794]: Removed session 5.
Jan 26 18:49:22 np0005596227.novalocal sshd-session[24349]: Connection closed by 38.102.83.66 port 37768 [preauth]
Jan 26 18:49:22 np0005596227.novalocal sshd-session[24353]: Connection closed by 38.102.83.66 port 37774 [preauth]
Jan 26 18:49:22 np0005596227.novalocal sshd-session[24351]: Unable to negotiate with 38.102.83.66 port 37790: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 26 18:49:22 np0005596227.novalocal sshd-session[24352]: Unable to negotiate with 38.102.83.66 port 37794: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 26 18:49:22 np0005596227.novalocal sshd-session[24350]: Unable to negotiate with 38.102.83.66 port 37796: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 26 18:49:24 np0005596227.novalocal systemd[1]: Starting dnf makecache...
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: Failed determining last makecache time.
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: CentOS Stream 9 - BaseOS                         66 kB/s | 6.7 kB     00:00
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: CentOS Stream 9 - AppStream                      66 kB/s | 6.8 kB     00:00
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: CentOS Stream 9 - CRB                            65 kB/s | 6.6 kB     00:00
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: CentOS Stream 9 - Extras packages                66 kB/s | 7.3 kB     00:00
Jan 26 18:49:25 np0005596227.novalocal dnf[25089]: Metadata cache created.
Jan 26 18:49:26 np0005596227.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 26 18:49:26 np0005596227.novalocal systemd[1]: Finished dnf makecache.
Jan 26 18:49:27 np0005596227.novalocal sshd-session[26184]: Accepted publickey for zuul from 38.102.83.114 port 60384 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 18:49:27 np0005596227.novalocal systemd-logind[794]: New session 6 of user zuul.
Jan 26 18:49:27 np0005596227.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 26 18:49:27 np0005596227.novalocal sshd-session[26184]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:49:28 np0005596227.novalocal python3[26295]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL9+A9f54EofO8tTk+DT2Zpvdx83+1CPUE4KHVOwq5vDZt1vMsf3pC7LUvQEpq8kOHrdlIKIPeuEfM2wKITj/8M= zuul@np0005596226.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:49:29 np0005596227.novalocal sudo[26456]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcqrljpdqobuphjerkidmkrpiollzsjg ; /usr/bin/python3'
Jan 26 18:49:29 np0005596227.novalocal sudo[26456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:29 np0005596227.novalocal python3[26461]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL9+A9f54EofO8tTk+DT2Zpvdx83+1CPUE4KHVOwq5vDZt1vMsf3pC7LUvQEpq8kOHrdlIKIPeuEfM2wKITj/8M= zuul@np0005596226.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:49:29 np0005596227.novalocal sudo[26456]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:29 np0005596227.novalocal sudo[26726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czuyduaxwusnubvrriemvxgdnqxvbpus ; /usr/bin/python3'
Jan 26 18:49:29 np0005596227.novalocal sudo[26726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:29 np0005596227.novalocal python3[26739]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005596227.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 26 18:49:30 np0005596227.novalocal useradd[26813]: new group: name=cloud-admin, GID=1002
Jan 26 18:49:30 np0005596227.novalocal useradd[26813]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 26 18:49:30 np0005596227.novalocal sudo[26726]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:30 np0005596227.novalocal sudo[26959]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzltqnfhfvbibsbpvhguyvcxebdgokru ; /usr/bin/python3'
Jan 26 18:49:30 np0005596227.novalocal sudo[26959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:30 np0005596227.novalocal python3[26975]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL9+A9f54EofO8tTk+DT2Zpvdx83+1CPUE4KHVOwq5vDZt1vMsf3pC7LUvQEpq8kOHrdlIKIPeuEfM2wKITj/8M= zuul@np0005596226.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 26 18:49:30 np0005596227.novalocal sudo[26959]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:30 np0005596227.novalocal sudo[27246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzmmuazpxosidlzjfggjufodupiskqwh ; /usr/bin/python3'
Jan 26 18:49:30 np0005596227.novalocal sudo[27246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:30 np0005596227.novalocal python3[27257]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:49:30 np0005596227.novalocal sudo[27246]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:31 np0005596227.novalocal sudo[27533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bulwkoyjrbrfdcalcuqhgyxlsgygpwwb ; /usr/bin/python3'
Jan 26 18:49:31 np0005596227.novalocal sudo[27533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:31 np0005596227.novalocal python3[27543]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769453370.603197-151-58632120342538/source _original_basename=tmps57o3nrp follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:49:31 np0005596227.novalocal sudo[27533]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:31 np0005596227.novalocal sudo[27925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofhcqwrhhipkcmkfnxtmtjrucgsyipkx ; /usr/bin/python3'
Jan 26 18:49:31 np0005596227.novalocal sudo[27925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:49:32 np0005596227.novalocal python3[27931]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 26 18:49:32 np0005596227.novalocal systemd[1]: Starting Hostname Service...
Jan 26 18:49:32 np0005596227.novalocal systemd[1]: Started Hostname Service.
Jan 26 18:49:32 np0005596227.novalocal systemd-hostnamed[28017]: Changed pretty hostname to 'compute-0'
Jan 26 18:49:32 compute-0 systemd-hostnamed[28017]: Hostname set to <compute-0> (static)
Jan 26 18:49:32 compute-0 NetworkManager[7190]: <info>  [1769453372.3703] hostname: static hostname changed from "np0005596227.novalocal" to "compute-0"
Jan 26 18:49:32 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 18:49:32 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 18:49:32 compute-0 sudo[27925]: pam_unix(sudo:session): session closed for user root
Jan 26 18:49:33 compute-0 sshd-session[26234]: Connection closed by 38.102.83.114 port 60384
Jan 26 18:49:33 compute-0 sshd-session[26184]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:49:33 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 26 18:49:33 compute-0 systemd[1]: session-6.scope: Consumed 2.228s CPU time.
Jan 26 18:49:33 compute-0 systemd-logind[794]: Session 6 logged out. Waiting for processes to exit.
Jan 26 18:49:33 compute-0 systemd-logind[794]: Removed session 6.
Jan 26 18:49:37 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 18:49:37 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 18:49:37 compute-0 systemd[1]: man-db-cache-update.service: Consumed 59.173s CPU time.
Jan 26 18:49:37 compute-0 systemd[1]: run-r6fb52009c4604d65a6d65f66ee54e61f.service: Deactivated successfully.
Jan 26 18:49:42 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 18:50:02 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 18:51:14 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 26 18:51:14 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 26 18:51:14 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 26 18:51:14 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 26 18:53:46 compute-0 sshd-session[29938]: Accepted publickey for zuul from 38.102.83.66 port 37678 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 18:53:46 compute-0 systemd-logind[794]: New session 7 of user zuul.
Jan 26 18:53:46 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 26 18:53:46 compute-0 sshd-session[29938]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 18:53:47 compute-0 python3[30014]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 18:53:48 compute-0 sudo[30128]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yteagyjzmyynsvcgrlzlhwionjqtohkm ; /usr/bin/python3'
Jan 26 18:53:48 compute-0 sudo[30128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:49 compute-0 python3[30130]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:49 compute-0 sudo[30128]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:49 compute-0 sudo[30201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tifvkxbgmymrbjbsixcmoweukzmivtky ; /usr/bin/python3'
Jan 26 18:53:49 compute-0 sudo[30201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:49 compute-0 python3[30203]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=delorean.repo follow=False checksum=2e65f5781089f6db35f20eae2311859479a007a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:49 compute-0 sudo[30201]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:49 compute-0 sudo[30227]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svitujfbresvoyspfbhouiyqrgdhcjps ; /usr/bin/python3'
Jan 26 18:53:49 compute-0 sudo[30227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:49 compute-0 python3[30229]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:49 compute-0 sudo[30227]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:50 compute-0 sudo[30300]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cesopqefhiofkcpzdjmibxnabbrdxldr ; /usr/bin/python3'
Jan 26 18:53:50 compute-0 sudo[30300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:50 compute-0 python3[30302]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=2c5ad31b3cd5c5b96a9995d83e342833f9bd7020 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:50 compute-0 sudo[30300]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:50 compute-0 sudo[30326]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdzlpjbeokzyhgplfydvilooxljdhghs ; /usr/bin/python3'
Jan 26 18:53:50 compute-0 sudo[30326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:50 compute-0 python3[30328]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:50 compute-0 sudo[30326]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:50 compute-0 sudo[30399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-errnsibbtenosuzfofwpnidoxeqgrovo ; /usr/bin/python3'
Jan 26 18:53:50 compute-0 sudo[30399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:51 compute-0 python3[30401]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:51 compute-0 sudo[30399]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:51 compute-0 sudo[30425]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnomwfvuatixvhoafbcoohqofebcani ; /usr/bin/python3'
Jan 26 18:53:51 compute-0 sudo[30425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:51 compute-0 python3[30427]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:51 compute-0 sudo[30425]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:51 compute-0 sudo[30498]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrrlhhjrtoyufflijkgszufpxnjcfvte ; /usr/bin/python3'
Jan 26 18:53:51 compute-0 sudo[30498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:51 compute-0 python3[30500]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:51 compute-0 sudo[30498]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:51 compute-0 sudo[30524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaehskrqwykfazfverwujiokcqborgwv ; /usr/bin/python3'
Jan 26 18:53:51 compute-0 sudo[30524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:51 compute-0 python3[30526]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:51 compute-0 sudo[30524]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:52 compute-0 sudo[30597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxeaejvabrrcssuveouaqmdvzorekzxk ; /usr/bin/python3'
Jan 26 18:53:52 compute-0 sudo[30597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:52 compute-0 python3[30599]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:52 compute-0 sudo[30597]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:52 compute-0 sudo[30623]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atvcydwdvehieyfzbsydvualszjelqjs ; /usr/bin/python3'
Jan 26 18:53:52 compute-0 sudo[30623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:52 compute-0 python3[30625]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:52 compute-0 sudo[30623]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:52 compute-0 sudo[30696]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvobtwsdaacjbgqxplaftdwtkyqcdhja ; /usr/bin/python3'
Jan 26 18:53:52 compute-0 sudo[30696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:53 compute-0 python3[30698]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:53 compute-0 sudo[30696]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:53 compute-0 sudo[30722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdjtgkfmtilqgdydgqtgdhlxyplrpvos ; /usr/bin/python3'
Jan 26 18:53:53 compute-0 sudo[30722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:53 compute-0 python3[30724]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:53 compute-0 sudo[30722]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:53 compute-0 sudo[30795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvdhiympzvljnjitbmzzutygobednzyy ; /usr/bin/python3'
Jan 26 18:53:53 compute-0 sudo[30795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:53 compute-0 python3[30797]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=aa03f96b62b2a238943efcc5a547883c212e7d56 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:53 compute-0 sudo[30795]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:53 compute-0 sudo[30821]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trdrcwkgjsefxrpkrtbsawpjkgxxxbpz ; /usr/bin/python3'
Jan 26 18:53:53 compute-0 sudo[30821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:54 compute-0 python3[30823]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 18:53:54 compute-0 sudo[30821]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:54 compute-0 sudo[30894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabwtxrkrrgxtvvwtnlkvbrjidhgpsga ; /usr/bin/python3'
Jan 26 18:53:54 compute-0 sudo[30894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 18:53:54 compute-0 python3[30896]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769453628.7547812-33923-151310681581982/source mode=0755 _original_basename=gating.repo follow=False checksum=5aa82d5476c1c66694b9b67e60505de4ae915bc7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 18:53:54 compute-0 sudo[30894]: pam_unix(sudo:session): session closed for user root
Jan 26 18:53:56 compute-0 sshd-session[30922]: Connection closed by 192.168.122.11 port 60428 [preauth]
Jan 26 18:53:56 compute-0 sshd-session[30921]: Connection closed by 192.168.122.11 port 60422 [preauth]
Jan 26 18:53:56 compute-0 sshd-session[30923]: Unable to negotiate with 192.168.122.11 port 60436: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 26 18:53:56 compute-0 sshd-session[30924]: Unable to negotiate with 192.168.122.11 port 60444: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 26 18:53:56 compute-0 sshd-session[30926]: Unable to negotiate with 192.168.122.11 port 60446: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 26 18:54:52 compute-0 python3[30955]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 18:57:30 compute-0 chronyd[783]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Jan 26 18:59:51 compute-0 sshd-session[29941]: Received disconnect from 38.102.83.66 port 37678:11: disconnected by user
Jan 26 18:59:51 compute-0 sshd-session[29941]: Disconnected from user zuul 38.102.83.66 port 37678
Jan 26 18:59:51 compute-0 sshd-session[29938]: pam_unix(sshd:session): session closed for user zuul
Jan 26 18:59:51 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 26 18:59:51 compute-0 systemd[1]: session-7.scope: Consumed 6.194s CPU time.
Jan 26 18:59:51 compute-0 systemd-logind[794]: Session 7 logged out. Waiting for processes to exit.
Jan 26 18:59:51 compute-0 systemd-logind[794]: Removed session 7.
Jan 26 19:01:01 compute-0 CROND[30963]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 19:01:01 compute-0 run-parts[30966]: (/etc/cron.hourly) starting 0anacron
Jan 26 19:01:01 compute-0 anacron[30974]: Anacron started on 2026-01-26
Jan 26 19:01:01 compute-0 anacron[30974]: Will run job `cron.daily' in 41 min.
Jan 26 19:01:01 compute-0 anacron[30974]: Will run job `cron.weekly' in 61 min.
Jan 26 19:01:01 compute-0 anacron[30974]: Will run job `cron.monthly' in 81 min.
Jan 26 19:01:01 compute-0 anacron[30974]: Jobs will be executed sequentially
Jan 26 19:01:01 compute-0 run-parts[30976]: (/etc/cron.hourly) finished 0anacron
Jan 26 19:01:01 compute-0 CROND[30962]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 19:06:49 compute-0 sshd-session[30981]: Accepted publickey for zuul from 192.168.122.30 port 42388 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:06:49 compute-0 systemd-logind[794]: New session 8 of user zuul.
Jan 26 19:06:49 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 26 19:06:49 compute-0 sshd-session[30981]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:06:50 compute-0 python3.9[31134]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:06:51 compute-0 sudo[31313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnfzajsjjhdpneufqelvtdlbvbinotax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454410.9803045-39-239693398137437/AnsiballZ_command.py'
Jan 26 19:06:51 compute-0 sudo[31313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:06:51 compute-0 python3.9[31315]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:06:59 compute-0 sudo[31313]: pam_unix(sudo:session): session closed for user root
Jan 26 19:06:59 compute-0 sshd-session[30984]: Connection closed by 192.168.122.30 port 42388
Jan 26 19:06:59 compute-0 sshd-session[30981]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:06:59 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 26 19:06:59 compute-0 systemd[1]: session-8.scope: Consumed 8.547s CPU time.
Jan 26 19:06:59 compute-0 systemd-logind[794]: Session 8 logged out. Waiting for processes to exit.
Jan 26 19:06:59 compute-0 systemd-logind[794]: Removed session 8.
Jan 26 19:07:05 compute-0 sshd-session[31373]: Accepted publickey for zuul from 192.168.122.30 port 45228 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:07:05 compute-0 systemd-logind[794]: New session 9 of user zuul.
Jan 26 19:07:05 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 26 19:07:05 compute-0 sshd-session[31373]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:07:06 compute-0 python3.9[31526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:07:06 compute-0 sshd-session[31376]: Connection closed by 192.168.122.30 port 45228
Jan 26 19:07:06 compute-0 sshd-session[31373]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:07:06 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 26 19:07:06 compute-0 systemd-logind[794]: Session 9 logged out. Waiting for processes to exit.
Jan 26 19:07:06 compute-0 systemd-logind[794]: Removed session 9.
Jan 26 19:07:22 compute-0 sshd-session[31553]: Accepted publickey for zuul from 192.168.122.30 port 56010 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:07:22 compute-0 systemd-logind[794]: New session 10 of user zuul.
Jan 26 19:07:22 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 26 19:07:22 compute-0 sshd-session[31553]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:07:23 compute-0 python3.9[31706]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 26 19:07:24 compute-0 python3.9[31880]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:07:25 compute-0 sudo[32030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwyrbhfldpreixsxuymxnakuizjsamm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454444.7112648-64-215831922515535/AnsiballZ_command.py'
Jan 26 19:07:25 compute-0 sudo[32030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:25 compute-0 python3.9[32032]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:07:25 compute-0 sudo[32030]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:26 compute-0 sudo[32183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfsxwufmjtcxxnelarwwueyzxesgzaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454445.632451-88-77530426302635/AnsiballZ_stat.py'
Jan 26 19:07:26 compute-0 sudo[32183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:26 compute-0 python3.9[32185]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:07:26 compute-0 sudo[32183]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:26 compute-0 sudo[32335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsxpkdfqdkilckhsskogynfuulwkhgin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454446.5312345-104-145188134442385/AnsiballZ_file.py'
Jan 26 19:07:26 compute-0 sudo[32335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:27 compute-0 python3.9[32337]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:07:27 compute-0 sudo[32335]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:27 compute-0 sudo[32487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olxkxayrqibpqkxipxcapmahnnatvaeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454447.3610563-120-129680312732576/AnsiballZ_stat.py'
Jan 26 19:07:27 compute-0 sudo[32487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:27 compute-0 python3.9[32489]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:07:27 compute-0 sudo[32487]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:28 compute-0 sudo[32610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvsxgdcrewozaznhcxuwrpacfzcfobx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454447.3610563-120-129680312732576/AnsiballZ_copy.py'
Jan 26 19:07:28 compute-0 sudo[32610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:28 compute-0 python3.9[32612]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454447.3610563-120-129680312732576/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:07:28 compute-0 sudo[32610]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:28 compute-0 sudo[32762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uauohotoavgfypwqzmcpmbetghjzwhmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454448.682653-150-147453960425820/AnsiballZ_setup.py'
Jan 26 19:07:28 compute-0 sudo[32762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:29 compute-0 python3.9[32764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:07:29 compute-0 sudo[32762]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:29 compute-0 sudo[32918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zufjtnjkphtqxscumwbuouijjnnhymms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454449.6720974-166-47838945884665/AnsiballZ_file.py'
Jan 26 19:07:29 compute-0 sudo[32918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:30 compute-0 python3.9[32920]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:07:30 compute-0 sudo[32918]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:30 compute-0 sudo[33070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhjddcbrcxcmhdhkyjyaoniiypjozvsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454450.3373525-184-94631865785302/AnsiballZ_file.py'
Jan 26 19:07:30 compute-0 sudo[33070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:30 compute-0 python3.9[33072]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:07:31 compute-0 sudo[33070]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:31 compute-0 python3.9[33222]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:07:36 compute-0 python3.9[33475]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:07:36 compute-0 python3.9[33625]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:07:38 compute-0 python3.9[33779]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:07:38 compute-0 sudo[33935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyvwalormysyeqycpvxnbiklkjhcslwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454458.6460989-280-213803152564816/AnsiballZ_setup.py'
Jan 26 19:07:38 compute-0 sudo[33935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:39 compute-0 python3.9[33937]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:07:39 compute-0 sudo[33935]: pam_unix(sudo:session): session closed for user root
Jan 26 19:07:40 compute-0 sudo[34019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puocboydapurunifdwqhkxoatjkvmqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454458.6460989-280-213803152564816/AnsiballZ_dnf.py'
Jan 26 19:07:40 compute-0 sudo[34019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:07:40 compute-0 python3.9[34021]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:08:27 compute-0 systemd[1]: Reloading.
Jan 26 19:08:27 compute-0 systemd-rc-local-generator[34214]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:08:27 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 26 19:08:27 compute-0 systemd[1]: Reloading.
Jan 26 19:08:28 compute-0 systemd-rc-local-generator[34258]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:08:28 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 26 19:08:28 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 26 19:08:28 compute-0 systemd[1]: Reloading.
Jan 26 19:08:28 compute-0 systemd-rc-local-generator[34300]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:08:28 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 26 19:08:28 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:08:28 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:08:28 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:08:30 compute-0 sshd-session[34323]: Connection closed by 193.32.162.151 port 58974
Jan 26 19:09:33 compute-0 kernel: SELinux:  Converting 2724 SID table entries...
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:09:33 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:09:33 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 26 19:09:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:09:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:09:33 compute-0 systemd[1]: Reloading.
Jan 26 19:09:33 compute-0 systemd-rc-local-generator[34622]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:09:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:09:34 compute-0 sudo[34019]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:34 compute-0 sudo[35533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glhsoaspwjffbbhzbwwwlnyhvfctteej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454574.4085205-304-128135567629409/AnsiballZ_command.py'
Jan 26 19:09:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:09:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:09:34 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.175s CPU time.
Jan 26 19:09:34 compute-0 systemd[1]: run-r2d85c4512eff48d28f55aacd605aeb0d.service: Deactivated successfully.
Jan 26 19:09:34 compute-0 sudo[35533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:34 compute-0 python3.9[35536]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:09:35 compute-0 sudo[35533]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:36 compute-0 sudo[35816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqbevgmwidrunzzhyrcfpkowgwqwufri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454576.006675-320-69734512769297/AnsiballZ_selinux.py'
Jan 26 19:09:36 compute-0 sudo[35816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:36 compute-0 python3.9[35818]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 26 19:09:36 compute-0 sudo[35816]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:37 compute-0 sudo[35968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjmrycmefhmrqqdoqfgrvudidwzbfjvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454577.3057346-342-217478397393529/AnsiballZ_command.py'
Jan 26 19:09:37 compute-0 sudo[35968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:37 compute-0 python3.9[35970]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 26 19:09:38 compute-0 sudo[35968]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:39 compute-0 sudo[36121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otscaqyhrrxfgbxfjnpfatixvowxpicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454578.8985615-358-55166750894038/AnsiballZ_file.py'
Jan 26 19:09:39 compute-0 sudo[36121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:39 compute-0 python3.9[36123]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:09:39 compute-0 sudo[36121]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:40 compute-0 sudo[36273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zciqskguheiyqnmgeengrxxfdyhvzmwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454580.0656013-374-138877214722252/AnsiballZ_mount.py'
Jan 26 19:09:40 compute-0 sudo[36273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:40 compute-0 python3.9[36275]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 26 19:09:40 compute-0 sudo[36273]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:42 compute-0 sudo[36425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvdlelhgnduqrkkkpuqsnqpgmerwdblb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454582.0495992-430-192535168497531/AnsiballZ_file.py'
Jan 26 19:09:42 compute-0 sudo[36425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:42 compute-0 python3.9[36427]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:09:42 compute-0 sudo[36425]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:43 compute-0 sudo[36577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eamazujwfunggaaxaufxusbopxoyxzpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454582.730479-446-189908260444614/AnsiballZ_stat.py'
Jan 26 19:09:43 compute-0 sudo[36577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:43 compute-0 python3.9[36579]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:09:43 compute-0 sudo[36577]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:43 compute-0 sudo[36700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlvtppaclpiconbsqlxsezafxeienswo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454582.730479-446-189908260444614/AnsiballZ_copy.py'
Jan 26 19:09:43 compute-0 sudo[36700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:43 compute-0 python3.9[36702]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454582.730479-446-189908260444614/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:09:43 compute-0 sudo[36700]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:44 compute-0 sudo[36852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lujniazenihvtljufjuxpurqaftycavx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454584.6098106-494-58106175072346/AnsiballZ_stat.py'
Jan 26 19:09:44 compute-0 sudo[36852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:46 compute-0 python3.9[36854]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:09:46 compute-0 sudo[36852]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:47 compute-0 sudo[37004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjvpcjchhhzsfhukrnjkhueiohewwdhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454587.0930588-510-248411550387136/AnsiballZ_command.py'
Jan 26 19:09:47 compute-0 sudo[37004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:49 compute-0 python3.9[37006]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:09:49 compute-0 sudo[37004]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:50 compute-0 sudo[37158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eabeqmfxentjqdportfxkqkzutvruapq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454590.0417902-526-189059190764667/AnsiballZ_file.py'
Jan 26 19:09:50 compute-0 sudo[37158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:50 compute-0 python3.9[37160]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:09:50 compute-0 sudo[37158]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:51 compute-0 sudo[37310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcktsxdpizufipffmstwpqetrqgxzslv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454590.9646304-548-159095401766008/AnsiballZ_getent.py'
Jan 26 19:09:51 compute-0 sudo[37310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:51 compute-0 python3.9[37312]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 26 19:09:51 compute-0 sudo[37310]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:51 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:09:52 compute-0 sudo[37464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndwmxgsdwafgewyysiuctqnztwzpzyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454591.7576342-564-200615741980966/AnsiballZ_group.py'
Jan 26 19:09:52 compute-0 sudo[37464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:52 compute-0 python3.9[37466]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:09:52 compute-0 groupadd[37467]: group added to /etc/group: name=qemu, GID=107
Jan 26 19:09:52 compute-0 groupadd[37467]: group added to /etc/gshadow: name=qemu
Jan 26 19:09:52 compute-0 groupadd[37467]: new group: name=qemu, GID=107
Jan 26 19:09:52 compute-0 sudo[37464]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:53 compute-0 sudo[37622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgkdnpjzrxzfsjgugypurnsuidfuhbbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454592.7782614-580-101088723768710/AnsiballZ_user.py'
Jan 26 19:09:53 compute-0 sudo[37622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:53 compute-0 python3.9[37624]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 19:09:53 compute-0 useradd[37626]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 19:09:53 compute-0 sudo[37622]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:54 compute-0 sudo[37782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-serxxxyilcqwlkkwclpdxllntlzafffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454593.8250372-596-117032283254304/AnsiballZ_getent.py'
Jan 26 19:09:54 compute-0 sudo[37782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:54 compute-0 python3.9[37784]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 26 19:09:54 compute-0 sudo[37782]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:54 compute-0 sudo[37935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acnercgygrmyocjghohnypkjowxpljpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454594.4879076-612-169839009360799/AnsiballZ_group.py'
Jan 26 19:09:54 compute-0 sudo[37935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:55 compute-0 python3.9[37937]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:09:55 compute-0 groupadd[37938]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 26 19:09:55 compute-0 groupadd[37938]: group added to /etc/gshadow: name=hugetlbfs
Jan 26 19:09:55 compute-0 groupadd[37938]: new group: name=hugetlbfs, GID=42477
Jan 26 19:09:55 compute-0 sudo[37935]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:55 compute-0 sudo[38093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckljqiwfhsfionpenyfotzvqdwcnwvsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454595.3511589-630-84398995492514/AnsiballZ_file.py'
Jan 26 19:09:55 compute-0 sudo[38093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:55 compute-0 python3.9[38095]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 26 19:09:55 compute-0 sudo[38093]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:56 compute-0 sudo[38245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srgbmgdxscazfywxtmaxadxbvlvjqtbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454596.2636123-652-178722469260221/AnsiballZ_dnf.py'
Jan 26 19:09:56 compute-0 sudo[38245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:56 compute-0 python3.9[38247]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:09:58 compute-0 sudo[38245]: pam_unix(sudo:session): session closed for user root
Jan 26 19:09:59 compute-0 sudo[38398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvmttrenomitziogrnflqybbcntdnhdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454599.0490968-668-79733538523128/AnsiballZ_file.py'
Jan 26 19:09:59 compute-0 sudo[38398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:09:59 compute-0 python3.9[38400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:09:59 compute-0 sudo[38398]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:00 compute-0 sudo[38550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zugeoiikhsjlmrrhulacxynuauycylty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454599.7466486-684-166896403270715/AnsiballZ_stat.py'
Jan 26 19:10:00 compute-0 sudo[38550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:00 compute-0 python3.9[38552]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:10:00 compute-0 sudo[38550]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:00 compute-0 sudo[38673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsdwhfchhnwuflnfnlieyoktyrajwdso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454599.7466486-684-166896403270715/AnsiballZ_copy.py'
Jan 26 19:10:00 compute-0 sudo[38673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:00 compute-0 python3.9[38675]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769454599.7466486-684-166896403270715/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:00 compute-0 sudo[38673]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:01 compute-0 sudo[38825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprcsyzntbxsrpyegxmuslgdmidivsvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454600.951043-714-171916106923701/AnsiballZ_systemd.py'
Jan 26 19:10:01 compute-0 sudo[38825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:01 compute-0 python3.9[38827]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:10:01 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 26 19:10:01 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 26 19:10:01 compute-0 kernel: Bridge firewalling registered
Jan 26 19:10:01 compute-0 systemd-modules-load[38831]: Inserted module 'br_netfilter'
Jan 26 19:10:01 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 26 19:10:01 compute-0 sudo[38825]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:02 compute-0 sudo[38984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fagxngtdolhxtgtaleanymlnjhjkkdlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454602.0816092-730-269681012765346/AnsiballZ_stat.py'
Jan 26 19:10:02 compute-0 sudo[38984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:02 compute-0 python3.9[38986]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:10:02 compute-0 sudo[38984]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:02 compute-0 sudo[39107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjllaudlcezesaonvugydoryjgqmqimq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454602.0816092-730-269681012765346/AnsiballZ_copy.py'
Jan 26 19:10:02 compute-0 sudo[39107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:03 compute-0 python3.9[39109]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769454602.0816092-730-269681012765346/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:03 compute-0 sudo[39107]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:03 compute-0 sudo[39259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccooauwwrcpipogotavhwkbavzceoyyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454603.492414-766-142844964650784/AnsiballZ_dnf.py'
Jan 26 19:10:03 compute-0 sudo[39259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:04 compute-0 python3.9[39261]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:10:07 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:10:07 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:10:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:10:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:10:07 compute-0 systemd[1]: Reloading.
Jan 26 19:10:07 compute-0 systemd-rc-local-generator[39325]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:10:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:10:08 compute-0 sudo[39259]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:09 compute-0 python3.9[40528]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:10:10 compute-0 python3.9[41480]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 26 19:10:10 compute-0 python3.9[42214]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:10:11 compute-0 sudo[43155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqbpvtbxxkhqkoafbykwywysvcuabzcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454611.2073317-844-278047026498943/AnsiballZ_command.py'
Jan 26 19:10:11 compute-0 sudo[43155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:11 compute-0 python3.9[43171]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:11 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 19:10:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:10:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:10:11 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.166s CPU time.
Jan 26 19:10:11 compute-0 systemd[1]: run-r0da68b947a184f978f2cf9cd206dc094.service: Deactivated successfully.
Jan 26 19:10:12 compute-0 systemd[1]: Starting Authorization Manager...
Jan 26 19:10:12 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 19:10:12 compute-0 polkitd[43651]: Started polkitd version 0.117
Jan 26 19:10:12 compute-0 polkitd[43651]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 19:10:12 compute-0 polkitd[43651]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 19:10:12 compute-0 polkitd[43651]: Finished loading, compiling and executing 2 rules
Jan 26 19:10:12 compute-0 systemd[1]: Started Authorization Manager.
Jan 26 19:10:12 compute-0 polkitd[43651]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 26 19:10:12 compute-0 sudo[43155]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:12 compute-0 sudo[43819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycstnpuyhwzexqaxugmjqhrokonieipl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454612.5615544-862-142698709458846/AnsiballZ_systemd.py'
Jan 26 19:10:12 compute-0 sudo[43819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:13 compute-0 python3.9[43821]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:10:13 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 26 19:10:13 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 26 19:10:13 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 26 19:10:13 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 26 19:10:13 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 26 19:10:13 compute-0 sudo[43819]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:14 compute-0 python3.9[43983]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 26 19:10:17 compute-0 sudo[44133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fysfjxnbvihthswzgzvgjeehufelmatq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454616.7696133-976-101486096938326/AnsiballZ_systemd.py'
Jan 26 19:10:17 compute-0 sudo[44133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:17 compute-0 python3.9[44135]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:10:17 compute-0 systemd[1]: Reloading.
Jan 26 19:10:17 compute-0 systemd-rc-local-generator[44165]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:10:17 compute-0 sudo[44133]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:18 compute-0 sudo[44322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmshcivvencyqgtgpvdthqyfcyqcnhgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454617.7772222-976-15170255939412/AnsiballZ_systemd.py'
Jan 26 19:10:18 compute-0 sudo[44322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:18 compute-0 python3.9[44324]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:10:18 compute-0 systemd[1]: Reloading.
Jan 26 19:10:18 compute-0 systemd-rc-local-generator[44357]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:10:18 compute-0 sudo[44322]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:19 compute-0 sudo[44511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbiysinbgbhtjntpdzwprlemwqrpmyru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454619.0686686-1008-137216096834885/AnsiballZ_command.py'
Jan 26 19:10:19 compute-0 sudo[44511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:19 compute-0 python3.9[44513]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:19 compute-0 sudo[44511]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:20 compute-0 sudo[44664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yevjpzijvnsyrfqumupqlnzveqwadusz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454619.7947757-1024-279249827043421/AnsiballZ_command.py'
Jan 26 19:10:20 compute-0 sudo[44664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:20 compute-0 python3.9[44666]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:20 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 26 19:10:20 compute-0 sudo[44664]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:20 compute-0 sudo[44817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoindlmgstggyqtsfzlpqqlrjebkqjxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454620.5739617-1040-73910895127266/AnsiballZ_command.py'
Jan 26 19:10:20 compute-0 sudo[44817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:21 compute-0 python3.9[44819]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:22 compute-0 sudo[44817]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:23 compute-0 sudo[44979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgyvyxfbguitxendlyvzfmjifnnoyjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454622.7374523-1056-87907025228891/AnsiballZ_command.py'
Jan 26 19:10:23 compute-0 sudo[44979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:23 compute-0 python3.9[44981]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:23 compute-0 sudo[44979]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:23 compute-0 sudo[45132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldudqnnuyprxsmmvlvllftfbgbhclcyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454623.433327-1072-21074058106634/AnsiballZ_systemd.py'
Jan 26 19:10:23 compute-0 sudo[45132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:24 compute-0 python3.9[45134]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:10:24 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 26 19:10:24 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 26 19:10:24 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 26 19:10:24 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 26 19:10:24 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 26 19:10:24 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 26 19:10:24 compute-0 sudo[45132]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:24 compute-0 sshd-session[31556]: Connection closed by 192.168.122.30 port 56010
Jan 26 19:10:24 compute-0 sshd-session[31553]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:10:24 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 26 19:10:24 compute-0 systemd[1]: session-10.scope: Consumed 2min 19.287s CPU time.
Jan 26 19:10:24 compute-0 systemd-logind[794]: Session 10 logged out. Waiting for processes to exit.
Jan 26 19:10:24 compute-0 systemd-logind[794]: Removed session 10.
Jan 26 19:10:30 compute-0 sshd-session[45164]: Accepted publickey for zuul from 192.168.122.30 port 56438 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:10:30 compute-0 systemd-logind[794]: New session 11 of user zuul.
Jan 26 19:10:30 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 26 19:10:30 compute-0 sshd-session[45164]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:10:31 compute-0 python3.9[45317]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:10:32 compute-0 python3.9[45471]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:10:33 compute-0 sudo[45625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrbbrnyltxrnbrkcmjstesqomkqjxmfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454633.3680797-75-237565954542171/AnsiballZ_command.py'
Jan 26 19:10:33 compute-0 sudo[45625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:33 compute-0 python3.9[45627]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:33 compute-0 sudo[45625]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:34 compute-0 python3.9[45778]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:10:35 compute-0 sudo[45932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdoiiraljingjrotilambjvfecqhgvcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454635.3795395-115-181859949705154/AnsiballZ_setup.py'
Jan 26 19:10:35 compute-0 sudo[45932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:35 compute-0 python3.9[45934]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:10:36 compute-0 sudo[45932]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:36 compute-0 sudo[46016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqftjmogftborutcqqgobftqhfshyhly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454635.3795395-115-181859949705154/AnsiballZ_dnf.py'
Jan 26 19:10:36 compute-0 sudo[46016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:36 compute-0 python3.9[46018]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:10:38 compute-0 sudo[46016]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:38 compute-0 sudo[46169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzlgxrdacdiljivqcywbtetvmuhsvjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454638.2571363-139-161898552932775/AnsiballZ_setup.py'
Jan 26 19:10:38 compute-0 sudo[46169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:38 compute-0 python3.9[46171]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:10:39 compute-0 sudo[46169]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:39 compute-0 sudo[46340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxmolezbdswkuiqqtrcyxxcarowyydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454639.3338308-161-96503653627394/AnsiballZ_file.py'
Jan 26 19:10:39 compute-0 sudo[46340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:39 compute-0 python3.9[46342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:10:40 compute-0 sudo[46340]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:40 compute-0 sudo[46492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiyjaibozcpqnpnvlakpzwrcxzckvmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454640.204061-177-76614947109957/AnsiballZ_command.py'
Jan 26 19:10:40 compute-0 sudo[46492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:40 compute-0 python3.9[46494]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:10:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3097731092-merged.mount: Deactivated successfully.
Jan 26 19:10:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2286780003-merged.mount: Deactivated successfully.
Jan 26 19:10:40 compute-0 podman[46495]: 2026-01-26 19:10:40.779902256 +0000 UTC m=+0.063072003 system refresh
Jan 26 19:10:40 compute-0 sudo[46492]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:41 compute-0 sudo[46654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqdgpxzveensemrcvwtxzlcflxnxamkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454641.0453255-193-226130077332756/AnsiballZ_stat.py'
Jan 26 19:10:41 compute-0 sudo[46654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:10:41 compute-0 python3.9[46656]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:10:41 compute-0 sudo[46654]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:42 compute-0 sudo[46777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjollfzzkwujuvaonfmmqhvpafkvohr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454641.0453255-193-226130077332756/AnsiballZ_copy.py'
Jan 26 19:10:42 compute-0 sudo[46777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:42 compute-0 python3.9[46779]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454641.0453255-193-226130077332756/.source.json follow=False _original_basename=podman_network_config.j2 checksum=92744710ab0659111d16a9e8da86f2645594c853 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:10:42 compute-0 sudo[46777]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:43 compute-0 sudo[46929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odusjnjdjfybxjhiuhfjwrpckkmcbfwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454642.8005543-223-260118180987364/AnsiballZ_stat.py'
Jan 26 19:10:43 compute-0 sudo[46929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:43 compute-0 python3.9[46931]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:10:43 compute-0 sudo[46929]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:43 compute-0 sudo[47052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqlkppknoakyxpyachvxfuckqzdnghf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454642.8005543-223-260118180987364/AnsiballZ_copy.py'
Jan 26 19:10:43 compute-0 sudo[47052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:43 compute-0 python3.9[47054]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769454642.8005543-223-260118180987364/.source.conf follow=False _original_basename=registries.conf.j2 checksum=6840789ba347f1e38924c850b193031239bba299 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:44 compute-0 sudo[47052]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:44 compute-0 sudo[47204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbnucawyugmccbafcdgwqkkeehdlbfym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454644.2224417-255-124865377495244/AnsiballZ_ini_file.py'
Jan 26 19:10:44 compute-0 sudo[47204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:44 compute-0 python3.9[47206]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:44 compute-0 sudo[47204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:45 compute-0 sudo[47356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txwixtblabeclxuomwzwtudkgdbgzgfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454645.052544-255-112933445547829/AnsiballZ_ini_file.py'
Jan 26 19:10:45 compute-0 sudo[47356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:45 compute-0 python3.9[47358]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:45 compute-0 sudo[47356]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:46 compute-0 sudo[47508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifhnftxagmstsrzudjdlkzviimfdyngz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454645.7392924-255-111515956301017/AnsiballZ_ini_file.py'
Jan 26 19:10:46 compute-0 sudo[47508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:46 compute-0 python3.9[47510]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:46 compute-0 sudo[47508]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:46 compute-0 sudo[47660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyugbfgebztdtztloebhegesqwylqahk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454646.3561695-255-186078673958291/AnsiballZ_ini_file.py'
Jan 26 19:10:46 compute-0 sudo[47660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:46 compute-0 python3.9[47662]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:10:46 compute-0 sudo[47660]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:47 compute-0 python3.9[47812]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:10:48 compute-0 sudo[47964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfpuawvmrqlpabnbsvxqlbxhothggwps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454648.0608141-335-243650208565788/AnsiballZ_dnf.py'
Jan 26 19:10:48 compute-0 sudo[47964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:48 compute-0 python3.9[47966]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:10:49 compute-0 sudo[47964]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:50 compute-0 sudo[48117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsajiimdkynadryrzoshntqgyadebkos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454650.0086322-351-262932839822356/AnsiballZ_dnf.py'
Jan 26 19:10:50 compute-0 sudo[48117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:50 compute-0 python3.9[48119]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:10:52 compute-0 sudo[48117]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:52 compute-0 sudo[48278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xshnakceyjnfdlzqghqxpxzrivnpfwax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454652.5642476-371-170445736068967/AnsiballZ_dnf.py'
Jan 26 19:10:52 compute-0 sudo[48278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:53 compute-0 python3.9[48280]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:10:54 compute-0 sudo[48278]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:55 compute-0 sudo[48431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxynxlqspxgojzvoryclakuacdymklt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454654.8019152-389-132309907664933/AnsiballZ_dnf.py'
Jan 26 19:10:55 compute-0 sudo[48431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:55 compute-0 python3.9[48433]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:10:56 compute-0 sudo[48431]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:57 compute-0 sudo[48584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldmlspqqpenfzdaqexnqwndavkwfqsmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454657.150906-411-106576675295709/AnsiballZ_dnf.py'
Jan 26 19:10:57 compute-0 sudo[48584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:10:57 compute-0 python3.9[48586]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:10:59 compute-0 sudo[48584]: pam_unix(sudo:session): session closed for user root
Jan 26 19:10:59 compute-0 sudo[48740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tovaptbvsxkxrbvmvnrdgrhjkkdqlvpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454659.5751314-427-114714105172847/AnsiballZ_dnf.py'
Jan 26 19:10:59 compute-0 sudo[48740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:00 compute-0 python3.9[48742]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:11:03 compute-0 sudo[48740]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:04 compute-0 sudo[48909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byztophfdcpwagdrvphvvtpzhjngghen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454664.4782598-445-33156468928278/AnsiballZ_dnf.py'
Jan 26 19:11:04 compute-0 sudo[48909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:05 compute-0 python3.9[48911]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:11:06 compute-0 sudo[48909]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:06 compute-0 sudo[49062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebtilovdnbpigaatqyahvoecslbzscsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454666.5402253-463-226471979694859/AnsiballZ_dnf.py'
Jan 26 19:11:06 compute-0 sudo[49062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:07 compute-0 python3.9[49064]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:11:19 compute-0 sudo[49062]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:19 compute-0 sudo[49399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhrqbhqqguxqndhssskduilehqyikiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454679.5363066-481-139924805026166/AnsiballZ_dnf.py'
Jan 26 19:11:19 compute-0 sudo[49399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:20 compute-0 python3.9[49401]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:11:21 compute-0 sudo[49399]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:22 compute-0 sudo[49555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yewaziygsupaatujnphxktweziuizewa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454681.654942-501-154304469675195/AnsiballZ_dnf.py'
Jan 26 19:11:22 compute-0 sudo[49555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:22 compute-0 python3.9[49557]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:11:23 compute-0 sudo[49555]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:24 compute-0 sudo[49712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhmmmygeqjauulemglkzunysvvaorego ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454684.2389538-523-265721357700090/AnsiballZ_file.py'
Jan 26 19:11:24 compute-0 sudo[49712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:24 compute-0 python3.9[49714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:11:24 compute-0 sudo[49712]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:25 compute-0 sudo[49887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maxvwczmyakgbmwxfllfgdikwrjylleq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454684.9240708-539-273247784293322/AnsiballZ_stat.py'
Jan 26 19:11:25 compute-0 sudo[49887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:25 compute-0 python3.9[49889]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:11:25 compute-0 sudo[49887]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:25 compute-0 sudo[50010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sarwgiaqllsmozmhynxmsvevbmhpkrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454684.9240708-539-273247784293322/AnsiballZ_copy.py'
Jan 26 19:11:25 compute-0 sudo[50010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:26 compute-0 python3.9[50012]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769454684.9240708-539-273247784293322/.source.json _original_basename=.c11dv5h5 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:11:26 compute-0 sudo[50010]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:26 compute-0 sudo[50162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baoeyrgnualslgwrnjvpxloxwpooeyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454686.4031634-575-26187532871038/AnsiballZ_podman_image.py'
Jan 26 19:11:26 compute-0 sudo[50162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:27 compute-0 python3.9[50164]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 19:11:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat863456530-lower\x2dmapped.mount: Deactivated successfully.
Jan 26 19:11:36 compute-0 podman[50176]: 2026-01-26 19:11:36.498276722 +0000 UTC m=+9.245549922 image pull 241d2c1ab738336a495a3844d8edb58bb1ca6339db3c90d7e6fb4b3656492432 38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 19:11:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:36 compute-0 sudo[50162]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:37 compute-0 sudo[50473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqrzfrfwaspkybqkofbvyoyhegkhexiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454697.0300524-597-109659637979335/AnsiballZ_podman_image.py'
Jan 26 19:11:37 compute-0 sudo[50473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:37 compute-0 python3.9[50475]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 19:11:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:46 compute-0 podman[50486]: 2026-01-26 19:11:46.928394958 +0000 UTC m=+9.355282771 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:11:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:11:47 compute-0 sudo[50473]: pam_unix(sudo:session): session closed for user root
Jan 26 19:11:47 compute-0 sudo[50792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fowvzrjhutqxkmifpiddszimzcpprcdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454707.5404544-617-171612827661225/AnsiballZ_podman_image.py'
Jan 26 19:11:47 compute-0 sudo[50792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:11:48 compute-0 python3.9[50794]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 19:11:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:04 compute-0 podman[50806]: 2026-01-26 19:12:04.631638472 +0000 UTC m=+16.547189844 image pull 00a1d0493134435a0b50f81676478a7bc2e0126d0e30cb65072b8884b766f13f 38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:04 compute-0 sudo[50792]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:05 compute-0 sudo[51060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvensibvoxquzewbglztzcquatysjflt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454725.2009747-639-260487965841480/AnsiballZ_podman_image.py'
Jan 26 19:12:05 compute-0 sudo[51060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:05 compute-0 python3.9[51062]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.223:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 19:12:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:09 compute-0 podman[51076]: 2026-01-26 19:12:09.004408003 +0000 UTC m=+3.203083981 image pull e62ff214fcbd3a3e083b358fe3feadc3fde0aa5783c188085fb2362542b5f99d 38.102.83.223:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Jan 26 19:12:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:09 compute-0 sudo[51060]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:09 compute-0 sudo[51331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkcrmowsoydxnlhlzuqjjbpjtqzmcxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454729.4185288-639-64320316709690/AnsiballZ_podman_image.py'
Jan 26 19:12:09 compute-0 sudo[51331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:09 compute-0 python3.9[51333]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 26 19:12:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:11 compute-0 podman[51345]: 2026-01-26 19:12:11.221417624 +0000 UTC m=+1.144510238 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 26 19:12:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:12:11 compute-0 sudo[51331]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:12 compute-0 sshd-session[45167]: Connection closed by 192.168.122.30 port 56438
Jan 26 19:12:12 compute-0 sshd-session[45164]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:12:12 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 26 19:12:12 compute-0 systemd[1]: session-11.scope: Consumed 2min 5.697s CPU time.
Jan 26 19:12:12 compute-0 systemd-logind[794]: Session 11 logged out. Waiting for processes to exit.
Jan 26 19:12:12 compute-0 systemd-logind[794]: Removed session 11.
Jan 26 19:12:17 compute-0 sshd-session[51492]: Accepted publickey for zuul from 192.168.122.30 port 38546 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:12:17 compute-0 systemd-logind[794]: New session 12 of user zuul.
Jan 26 19:12:17 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 26 19:12:17 compute-0 sshd-session[51492]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:12:18 compute-0 python3.9[51645]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:12:19 compute-0 sudo[51799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjtmenqqxlegemrqjjceuvcqsvwruvel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454739.3635116-47-267732575901425/AnsiballZ_getent.py'
Jan 26 19:12:19 compute-0 sudo[51799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:20 compute-0 python3.9[51801]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 26 19:12:20 compute-0 sudo[51799]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:20 compute-0 sudo[51952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzygogvegcxhkeiwdbwndnwkvsbqknin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454740.3439052-63-256047620827030/AnsiballZ_group.py'
Jan 26 19:12:20 compute-0 sudo[51952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:21 compute-0 python3.9[51954]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:12:21 compute-0 groupadd[51955]: group added to /etc/group: name=openvswitch, GID=42476
Jan 26 19:12:21 compute-0 groupadd[51955]: group added to /etc/gshadow: name=openvswitch
Jan 26 19:12:21 compute-0 groupadd[51955]: new group: name=openvswitch, GID=42476
Jan 26 19:12:21 compute-0 sudo[51952]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:21 compute-0 sudo[52110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpcendgmxxazdplwzqqeirohfndimyke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454741.335868-79-266237529847286/AnsiballZ_user.py'
Jan 26 19:12:21 compute-0 sudo[52110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:22 compute-0 python3.9[52112]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 19:12:22 compute-0 useradd[52114]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 19:12:22 compute-0 useradd[52114]: add 'openvswitch' to group 'hugetlbfs'
Jan 26 19:12:22 compute-0 useradd[52114]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 26 19:12:22 compute-0 sudo[52110]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:22 compute-0 sudo[52270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfdanygsvhylajmhyvilyptbphglfmoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454742.4490678-99-15465802262794/AnsiballZ_setup.py'
Jan 26 19:12:22 compute-0 sudo[52270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:23 compute-0 python3.9[52272]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:12:23 compute-0 sudo[52270]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:23 compute-0 sudo[52354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwrewxhscgwuwhtsaoiqxspoypgxzapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454742.4490678-99-15465802262794/AnsiballZ_dnf.py'
Jan 26 19:12:23 compute-0 sudo[52354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:23 compute-0 python3.9[52356]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:12:25 compute-0 sudo[52354]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:26 compute-0 sudo[52515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbviyrazdvgxhgysovrganacrcvdymjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454746.1719136-127-121254466018209/AnsiballZ_dnf.py'
Jan 26 19:12:26 compute-0 sudo[52515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:26 compute-0 python3.9[52517]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:12:39 compute-0 kernel: SELinux:  Converting 2738 SID table entries...
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:12:39 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:12:39 compute-0 groupadd[52543]: group added to /etc/group: name=unbound, GID=994
Jan 26 19:12:39 compute-0 groupadd[52543]: group added to /etc/gshadow: name=unbound
Jan 26 19:12:39 compute-0 groupadd[52543]: new group: name=unbound, GID=994
Jan 26 19:12:39 compute-0 useradd[52550]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 26 19:12:40 compute-0 sshd-session[52537]: Invalid user admin from 193.32.162.151 port 33356
Jan 26 19:12:40 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 26 19:12:40 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 26 19:12:40 compute-0 sshd-session[52537]: Connection closed by invalid user admin 193.32.162.151 port 33356 [preauth]
Jan 26 19:12:41 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:12:41 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:12:41 compute-0 systemd[1]: Reloading.
Jan 26 19:12:41 compute-0 systemd-rc-local-generator[53050]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:12:41 compute-0 systemd-sysv-generator[53054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:12:41 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:12:42 compute-0 sudo[52515]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:42 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:12:42 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:12:42 compute-0 systemd[1]: run-r6723a8baa1314ba3a158acb539ca591a.service: Deactivated successfully.
Jan 26 19:12:43 compute-0 sudo[53617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kowxtyrowdghljrgnpovhguwcocuzqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454762.6125097-143-276549964287184/AnsiballZ_systemd.py'
Jan 26 19:12:43 compute-0 sudo[53617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:43 compute-0 python3.9[53619]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:12:43 compute-0 systemd[1]: Reloading.
Jan 26 19:12:43 compute-0 systemd-sysv-generator[53651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:12:43 compute-0 systemd-rc-local-generator[53645]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:12:43 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 26 19:12:43 compute-0 chown[53661]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 26 19:12:44 compute-0 ovs-ctl[53666]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 26 19:12:44 compute-0 ovs-ctl[53666]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 26 19:12:44 compute-0 ovs-ctl[53666]: Starting ovsdb-server [  OK  ]
Jan 26 19:12:44 compute-0 ovs-vsctl[53715]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 26 19:12:44 compute-0 ovs-vsctl[53735]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"4b7fe4ab-0aa1-433c-a7da-fec1fee5732c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 26 19:12:44 compute-0 ovs-ctl[53666]: Configuring Open vSwitch system IDs [  OK  ]
Jan 26 19:12:44 compute-0 ovs-vsctl[53741]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 26 19:12:44 compute-0 ovs-ctl[53666]: Enabling remote OVSDB managers [  OK  ]
Jan 26 19:12:44 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 26 19:12:44 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 26 19:12:44 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 26 19:12:44 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 26 19:12:44 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 26 19:12:44 compute-0 ovs-ctl[53786]: Inserting openvswitch module [  OK  ]
Jan 26 19:12:44 compute-0 ovs-ctl[53755]: Starting ovs-vswitchd [  OK  ]
Jan 26 19:12:44 compute-0 ovs-ctl[53755]: Enabling remote OVSDB managers [  OK  ]
Jan 26 19:12:44 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 26 19:12:44 compute-0 ovs-vsctl[53807]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 26 19:12:44 compute-0 systemd[1]: Starting Open vSwitch...
Jan 26 19:12:44 compute-0 systemd[1]: Finished Open vSwitch.
Jan 26 19:12:44 compute-0 sudo[53617]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:45 compute-0 python3.9[53958]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:12:46 compute-0 sudo[54108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trtqrzhlhhgdmdlhjvniwzhiuvejtiso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454765.860378-179-141171287308554/AnsiballZ_sefcontext.py'
Jan 26 19:12:46 compute-0 sudo[54108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:46 compute-0 python3.9[54110]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 26 19:12:47 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:12:47 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:12:48 compute-0 sudo[54108]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:48 compute-0 python3.9[54265]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:12:49 compute-0 sudo[54421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqmpijueethrftajmfaugmkvepdzyinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454769.431286-215-35850701831438/AnsiballZ_dnf.py'
Jan 26 19:12:49 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 26 19:12:49 compute-0 sudo[54421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:49 compute-0 python3.9[54423]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:12:51 compute-0 sudo[54421]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:52 compute-0 sudo[54574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkogjpcrujsibuopfkmgrwvhipinsbcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454772.150207-231-22651287207217/AnsiballZ_command.py'
Jan 26 19:12:52 compute-0 sudo[54574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:52 compute-0 python3.9[54576]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:12:53 compute-0 sudo[54574]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:54 compute-0 sudo[54861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amdvlismtacophvvpnuerhvwrkzxqtpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454773.7194793-247-18367717917708/AnsiballZ_file.py'
Jan 26 19:12:54 compute-0 sudo[54861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:54 compute-0 python3.9[54863]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 26 19:12:54 compute-0 sudo[54861]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:55 compute-0 python3.9[55013]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:12:55 compute-0 sudo[55165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrhvzepwqxibnjwpclsdkrbtlyntoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454775.5087168-279-263412896630199/AnsiballZ_dnf.py'
Jan 26 19:12:55 compute-0 sudo[55165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:12:56 compute-0 python3.9[55167]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:12:57 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:12:57 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:12:57 compute-0 systemd[1]: Reloading.
Jan 26 19:12:57 compute-0 systemd-rc-local-generator[55205]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:12:57 compute-0 systemd-sysv-generator[55208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:12:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:12:58 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:12:58 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:12:58 compute-0 systemd[1]: run-r9f2cd2639d9f44adb9a90b7bd695350b.service: Deactivated successfully.
Jan 26 19:12:58 compute-0 sudo[55165]: pam_unix(sudo:session): session closed for user root
Jan 26 19:12:59 compute-0 sudo[55482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjehhsqespnbpzkjtmymucvunuvsmfsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454779.5389144-295-176812516750531/AnsiballZ_systemd.py'
Jan 26 19:12:59 compute-0 sudo[55482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:00 compute-0 python3.9[55484]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:13:00 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 26 19:13:00 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 26 19:13:00 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2653] caught SIGTERM, shutting down normally.
Jan 26 19:13:00 compute-0 systemd[1]: Stopping Network Manager...
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2668] dhcp4 (eth0): canceled DHCP transaction
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2669] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2669] dhcp4 (eth0): state changed no lease
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2672] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 19:13:00 compute-0 NetworkManager[7190]: <info>  [1769454780.2760] exiting (success)
Jan 26 19:13:00 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 19:13:00 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 26 19:13:00 compute-0 systemd[1]: Stopped Network Manager.
Jan 26 19:13:00 compute-0 systemd[1]: NetworkManager.service: Consumed 14.423s CPU time, 4.1M memory peak, read 0B from disk, written 15.0K to disk.
Jan 26 19:13:00 compute-0 systemd[1]: Starting Network Manager...
Jan 26 19:13:00 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.3239] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:482de948-a14a-4a06-a160-ce1b2a745f1c)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.3242] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.3300] manager[0x5634066ae000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 26 19:13:00 compute-0 systemd[1]: Starting Hostname Service...
Jan 26 19:13:00 compute-0 systemd[1]: Started Hostname Service.
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4576] hostname: hostname: using hostnamed
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4576] hostname: static hostname changed from (none) to "compute-0"
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4581] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4586] manager[0x5634066ae000]: rfkill: Wi-Fi hardware radio set enabled
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4588] manager[0x5634066ae000]: rfkill: WWAN hardware radio set enabled
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4613] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4623] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4624] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4624] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4625] manager: Networking is enabled by state file
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4627] settings: Loaded settings plugin: keyfile (internal)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4631] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4660] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4670] dhcp: init: Using DHCP client 'internal'
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4674] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4679] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4684] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4691] device (lo): Activation: starting connection 'lo' (4396d3e2-241a-4088-b481-db553b6a2730)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4698] device (eth0): carrier: link connected
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4703] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4707] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4708] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4713] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4719] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4724] device (eth1): carrier: link connected
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4728] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4733] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8) (indicated)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4733] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4738] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4745] device (eth1): Activation: starting connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8)
Jan 26 19:13:00 compute-0 systemd[1]: Started Network Manager.
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4751] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4758] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4760] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4762] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4764] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4767] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4768] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4770] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4774] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4779] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4782] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4799] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4815] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4825] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4829] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4835] device (lo): Activation: successful, device activated.
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4846] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4849] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4854] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.4856] device (eth1): Activation: successful, device activated.
Jan 26 19:13:00 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 26 19:13:00 compute-0 sudo[55482]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5271] dhcp4 (eth0): state changed new lease, address=38.102.83.58
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5277] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5336] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5358] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5359] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5362] manager: NetworkManager state is now CONNECTED_SITE
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5365] device (eth0): Activation: successful, device activated.
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5372] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 26 19:13:00 compute-0 NetworkManager[55489]: <info>  [1769454780.5375] manager: startup complete
Jan 26 19:13:00 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 26 19:13:01 compute-0 sudo[55708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojdgkmaywnwtdjeqbjuzxufcjngdgasw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454780.7102258-311-201315782120985/AnsiballZ_dnf.py'
Jan 26 19:13:01 compute-0 sudo[55708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:01 compute-0 python3.9[55710]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:13:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:13:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:13:05 compute-0 systemd[1]: Reloading.
Jan 26 19:13:05 compute-0 systemd-rc-local-generator[55762]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:13:05 compute-0 systemd-sysv-generator[55765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:13:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:13:06 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:13:06 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:13:06 compute-0 systemd[1]: run-ra4b5835730be4aff916896abbb496d67.service: Deactivated successfully.
Jan 26 19:13:07 compute-0 sudo[55708]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:10 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 19:13:10 compute-0 sudo[56167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbwlglstydguxgibmreqgileknnlafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454790.568863-335-163549693398186/AnsiballZ_stat.py'
Jan 26 19:13:10 compute-0 sudo[56167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:11 compute-0 python3.9[56169]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:13:11 compute-0 sudo[56167]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:11 compute-0 sudo[56319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aphsvfbnxifwlgaxtbcshfiuzpzcbtjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454791.454481-353-277834742306017/AnsiballZ_ini_file.py'
Jan 26 19:13:11 compute-0 sudo[56319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:12 compute-0 python3.9[56321]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:12 compute-0 sudo[56319]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:12 compute-0 sudo[56473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxeyhuedqaevmxltnodftrotcsekbcfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454792.4806802-373-263389371708796/AnsiballZ_ini_file.py'
Jan 26 19:13:12 compute-0 sudo[56473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:12 compute-0 python3.9[56475]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:12 compute-0 sudo[56473]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:13 compute-0 sudo[56625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxhodhsjhfkewctbjuvuoeqrpdhjvzxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454793.188534-373-178362723994991/AnsiballZ_ini_file.py'
Jan 26 19:13:13 compute-0 sudo[56625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:13 compute-0 python3.9[56627]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:13 compute-0 sudo[56625]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:14 compute-0 sudo[56777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itzgwhgsqzvtdevxvzuweyatrlrdcjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454794.055611-403-85714429921031/AnsiballZ_ini_file.py'
Jan 26 19:13:14 compute-0 sudo[56777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:14 compute-0 python3.9[56779]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:14 compute-0 sudo[56777]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:15 compute-0 sudo[56929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tayfdavbhcbcdejrkixnpufravzkwucw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454794.93564-403-41663487309435/AnsiballZ_ini_file.py'
Jan 26 19:13:15 compute-0 sudo[56929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:15 compute-0 python3.9[56931]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:15 compute-0 sudo[56929]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:16 compute-0 sudo[57081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylcmxzxvpqsrkvbaitjkyzfrhwqbmghc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454795.7402256-433-236536930026085/AnsiballZ_stat.py'
Jan 26 19:13:16 compute-0 sudo[57081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:16 compute-0 python3.9[57083]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:13:16 compute-0 sudo[57081]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:16 compute-0 sudo[57204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xztsjloubvxmcjzegtrqmndwciqwgiyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454795.7402256-433-236536930026085/AnsiballZ_copy.py'
Jan 26 19:13:16 compute-0 sudo[57204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:17 compute-0 python3.9[57206]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454795.7402256-433-236536930026085/.source _original_basename=.vrglixsr follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:17 compute-0 sudo[57204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:17 compute-0 sudo[57356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yolxklcuhozajoaguprofvhhrzxdxrhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454797.2151022-463-7305977704212/AnsiballZ_file.py'
Jan 26 19:13:17 compute-0 sudo[57356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:17 compute-0 python3.9[57358]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:17 compute-0 sudo[57356]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:18 compute-0 sudo[57508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irrszncjfxtjmhmggufofjisdhumklwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454797.9597595-479-204363109690956/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 26 19:13:18 compute-0 sudo[57508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:18 compute-0 python3.9[57510]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 26 19:13:18 compute-0 sudo[57508]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:19 compute-0 sudo[57660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inwsibzxuzioumyllibfjvwbgsvpnppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454798.8563395-497-44284106889987/AnsiballZ_file.py'
Jan 26 19:13:19 compute-0 sudo[57660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:19 compute-0 python3.9[57662]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:19 compute-0 sudo[57660]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:20 compute-0 sudo[57812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvuolctyvgwlhcwnhuegnmujylmpity ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454799.785152-517-175285717815914/AnsiballZ_stat.py'
Jan 26 19:13:20 compute-0 sudo[57812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:20 compute-0 sudo[57812]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:20 compute-0 sudo[57935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bleqvkvtbyqgywregkdlitxmzbpdphum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454799.785152-517-175285717815914/AnsiballZ_copy.py'
Jan 26 19:13:20 compute-0 sudo[57935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:20 compute-0 sudo[57935]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:21 compute-0 sudo[58087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsoyqdcdhnyeedspgjujjkjhmrhauikc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454801.169688-547-170466812259689/AnsiballZ_slurp.py'
Jan 26 19:13:21 compute-0 sudo[58087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:21 compute-0 python3.9[58089]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 26 19:13:21 compute-0 sudo[58087]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:22 compute-0 sudo[58262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcvugavccwubxvwhvxycmpbptjbcfujo ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454802.1325629-565-228539117379453/async_wrapper.py j27497534321 300 /home/zuul/.ansible/tmp/ansible-tmp-1769454802.1325629-565-228539117379453/AnsiballZ_edpm_os_net_config.py _'
Jan 26 19:13:22 compute-0 sudo[58262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:23 compute-0 ansible-async_wrapper.py[58264]: Invoked with j27497534321 300 /home/zuul/.ansible/tmp/ansible-tmp-1769454802.1325629-565-228539117379453/AnsiballZ_edpm_os_net_config.py _
Jan 26 19:13:23 compute-0 ansible-async_wrapper.py[58267]: Starting module and watcher
Jan 26 19:13:23 compute-0 ansible-async_wrapper.py[58267]: Start watching 58268 (300)
Jan 26 19:13:23 compute-0 ansible-async_wrapper.py[58268]: Start module (58268)
Jan 26 19:13:23 compute-0 ansible-async_wrapper.py[58264]: Return async_wrapper task started.
Jan 26 19:13:23 compute-0 sudo[58262]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:23 compute-0 python3.9[58269]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 26 19:13:23 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 26 19:13:23 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 26 19:13:23 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 26 19:13:23 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 26 19:13:23 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.4345] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.4374] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5311] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5314] audit: op="connection-add" uuid="3b03fd6c-fd24-4ffe-9212-00cf32482918" name="br-ex-br" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5336] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5338] audit: op="connection-add" uuid="403c8a51-830e-401c-b7ff-2ba4b348e50f" name="br-ex-port" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5356] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5359] audit: op="connection-add" uuid="a7734e75-53ee-4792-bf41-3d3f2d056522" name="eth1-port" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5376] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5378] audit: op="connection-add" uuid="8dd891e9-70aa-4705-848b-697adfe9a36c" name="vlan20-port" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5393] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5395] audit: op="connection-add" uuid="83636716-6860-4a20-8a47-d61a0d1d9508" name="vlan21-port" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5410] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5413] audit: op="connection-add" uuid="b3b76f37-ffa0-43bb-8c0f-e5d758258246" name="vlan22-port" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5443] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5468] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5471] audit: op="connection-add" uuid="779ac92e-0f10-4136-b1e0-2f57e46c311f" name="br-ex-if" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5733] audit: op="connection-update" uuid="aba96d01-0b38-5a28-a9c5-4517183729d8" name="ci-private-network" args="connection.controller,connection.master,connection.port-type,connection.slave-type,connection.timestamp,ovs-external-ids.data,ipv4.routes,ipv4.addresses,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.never-default,ipv6.routes,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.method,ovs-interface.type" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5757] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5760] audit: op="connection-add" uuid="edf6e5d5-cf39-40d5-ba6e-7affcb1d4bb1" name="vlan20-if" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5784] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5787] audit: op="connection-add" uuid="3248c519-e8ac-4138-96a3-845c9cb4468f" name="vlan21-if" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5810] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5813] audit: op="connection-add" uuid="91be9deb-2b01-49d4-8963-1a1c64595d55" name="vlan22-if" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5834] audit: op="connection-delete" uuid="66289fac-35b9-3051-8520-6ca87ed17b69" name="Wired connection 1" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5855] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5860] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5869] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5874] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (3b03fd6c-fd24-4ffe-9212-00cf32482918)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5875] audit: op="connection-activate" uuid="3b03fd6c-fd24-4ffe-9212-00cf32482918" name="br-ex-br" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5878] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5880] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5888] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5893] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (403c8a51-830e-401c-b7ff-2ba4b348e50f)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5895] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5897] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5902] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5906] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (a7734e75-53ee-4792-bf41-3d3f2d056522)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5908] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5910] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5915] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5920] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (8dd891e9-70aa-4705-848b-697adfe9a36c)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5922] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5924] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5930] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5935] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (83636716-6860-4a20-8a47-d61a0d1d9508)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5937] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5939] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5945] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5950] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (b3b76f37-ffa0-43bb-8c0f-e5d758258246)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5952] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5955] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5957] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5965] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.5967] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5971] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5975] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (779ac92e-0f10-4136-b1e0-2f57e46c311f)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5977] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5982] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5985] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5987] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.5990] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6004] device (eth1): disconnecting for new activation request.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6006] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6019] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6024] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6027] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6032] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.6034] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6039] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6046] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (edf6e5d5-cf39-40d5-ba6e-7affcb1d4bb1)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6047] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6053] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6055] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6058] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6062] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.6064] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6069] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6076] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (3248c519-e8ac-4138-96a3-845c9cb4468f)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6078] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6083] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6086] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6088] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6093] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <warn>  [1769454805.6095] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6099] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6106] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (91be9deb-2b01-49d4-8963-1a1c64595d55)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6108] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6113] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6116] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6117] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6121] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6150] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6154] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6160] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6164] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6179] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6187] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6206] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 26 19:13:25 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6215] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6218] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6227] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6233] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6240] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6243] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6252] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6259] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6265] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 systemd-udevd[58274]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:13:25 compute-0 kernel: Timeout policy base is empty
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6268] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6277] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6287] dhcp4 (eth0): canceled DHCP transaction
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6287] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6287] dhcp4 (eth0): state changed no lease
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6290] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6314] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6325] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58270 uid=0 result="fail" reason="Device is not activated"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6332] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 26 19:13:25 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6373] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6379] dhcp4 (eth0): state changed new lease, address=38.102.83.58
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6389] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6451] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6656] device (eth1): Activation: starting connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6666] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6671] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6689] device (eth1): disconnecting for new activation request.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6690] audit: op="connection-activate" uuid="aba96d01-0b38-5a28-a9c5-4517183729d8" name="ci-private-network" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6712] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6721] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6732] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6736] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6739] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6743] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6746] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6749] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6765] device (eth1): Activation: starting connection 'ci-private-network' (aba96d01-0b38-5a28-a9c5-4517183729d8)
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6782] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6792] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6801] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6809] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6818] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6827] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6836] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6845] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6854] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6863] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6875] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6883] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6896] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58270 uid=0 result="success"
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6902] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 kernel: br-ex: entered promiscuous mode
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6978] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.6987] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 kernel: vlan22: entered promiscuous mode
Jan 26 19:13:25 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 26 19:13:25 compute-0 systemd-udevd[58276]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7112] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7115] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7126] device (eth1): Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7170] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7196] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 kernel: vlan21: entered promiscuous mode
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7236] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7253] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7270] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7279] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 kernel: vlan20: entered promiscuous mode
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7296] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7325] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7328] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7340] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7400] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7422] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7450] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7458] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7459] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7468] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7503] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7556] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7558] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 26 19:13:25 compute-0 NetworkManager[55489]: <info>  [1769454805.7568] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 26 19:13:26 compute-0 sudo[58605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usdokmldfxftybrvtakkpseailnkasfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454806.2194421-565-243235766601871/AnsiballZ_async_status.py'
Jan 26 19:13:26 compute-0 sudo[58605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:26 compute-0 python3.9[58607]: ansible-ansible.legacy.async_status Invoked with jid=j27497534321.58264 mode=status _async_dir=/root/.ansible_async
Jan 26 19:13:26 compute-0 sudo[58605]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:26 compute-0 NetworkManager[55489]: <info>  [1769454806.9567] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.1779] checkpoint[0x563406684950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.1781] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.4356] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.4370] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.6512] audit: op="networking-control" arg="global-dns-configuration" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.6544] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.6579] audit: op="networking-control" arg="global-dns-configuration" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.6606] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.7964] checkpoint[0x563406684a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 26 19:13:27 compute-0 NetworkManager[55489]: <info>  [1769454807.7969] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58270 uid=0 result="success"
Jan 26 19:13:27 compute-0 ansible-async_wrapper.py[58268]: Module complete (58268)
Jan 26 19:13:28 compute-0 ansible-async_wrapper.py[58267]: Done in kid B.
Jan 26 19:13:30 compute-0 sudo[58711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjnawmwqlvrsvxdzdbeihpevvvhppngm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454806.2194421-565-243235766601871/AnsiballZ_async_status.py'
Jan 26 19:13:30 compute-0 sudo[58711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:30 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 19:13:31 compute-0 python3.9[58713]: ansible-ansible.legacy.async_status Invoked with jid=j27497534321.58264 mode=status _async_dir=/root/.ansible_async
Jan 26 19:13:31 compute-0 sudo[58711]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:31 compute-0 sudo[58813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unelbjovqzqgondrrifmhrjxxrtitiag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454806.2194421-565-243235766601871/AnsiballZ_async_status.py'
Jan 26 19:13:31 compute-0 sudo[58813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:31 compute-0 python3.9[58815]: ansible-ansible.legacy.async_status Invoked with jid=j27497534321.58264 mode=cleanup _async_dir=/root/.ansible_async
Jan 26 19:13:31 compute-0 sudo[58813]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:32 compute-0 sudo[58965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zikfrthydodjybpvnfojcimedpahdyku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454811.8154924-619-168364594724210/AnsiballZ_stat.py'
Jan 26 19:13:32 compute-0 sudo[58965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:32 compute-0 python3.9[58967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:13:32 compute-0 sudo[58965]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:32 compute-0 sudo[59088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlgeyujhtlimdbfxbjyuhtojfgerbwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454811.8154924-619-168364594724210/AnsiballZ_copy.py'
Jan 26 19:13:32 compute-0 sudo[59088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:33 compute-0 python3.9[59090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454811.8154924-619-168364594724210/.source.returncode _original_basename=.r7nkne9j follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:33 compute-0 sudo[59088]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:34 compute-0 sudo[59241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvifuddyudxyfmzsfsawpjdleridihyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454813.8432615-651-105224045085443/AnsiballZ_stat.py'
Jan 26 19:13:34 compute-0 sudo[59241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:34 compute-0 python3.9[59243]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:13:34 compute-0 sudo[59241]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:34 compute-0 sudo[59364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clwqgkrolpptxwjuaiegffrysmedhdbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454813.8432615-651-105224045085443/AnsiballZ_copy.py'
Jan 26 19:13:34 compute-0 sudo[59364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:35 compute-0 python3.9[59366]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454813.8432615-651-105224045085443/.source.cfg _original_basename=.ep7c7ppn follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:35 compute-0 sudo[59364]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:35 compute-0 sudo[59516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbspdmlprotcemdhigjqkvrjypljqnqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454815.304967-681-244146249522350/AnsiballZ_systemd.py'
Jan 26 19:13:35 compute-0 sudo[59516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:35 compute-0 python3.9[59518]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:13:36 compute-0 systemd[1]: Reloading Network Manager...
Jan 26 19:13:36 compute-0 NetworkManager[55489]: <info>  [1769454816.0833] audit: op="reload" arg="0" pid=59522 uid=0 result="success"
Jan 26 19:13:36 compute-0 NetworkManager[55489]: <info>  [1769454816.0845] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 26 19:13:36 compute-0 systemd[1]: Reloaded Network Manager.
Jan 26 19:13:36 compute-0 sudo[59516]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:36 compute-0 sshd-session[51495]: Connection closed by 192.168.122.30 port 38546
Jan 26 19:13:36 compute-0 sshd-session[51492]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:13:36 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 26 19:13:36 compute-0 systemd[1]: session-12.scope: Consumed 53.753s CPU time.
Jan 26 19:13:36 compute-0 systemd-logind[794]: Session 12 logged out. Waiting for processes to exit.
Jan 26 19:13:36 compute-0 systemd-logind[794]: Removed session 12.
Jan 26 19:13:41 compute-0 sshd-session[59553]: Accepted publickey for zuul from 192.168.122.30 port 57084 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:13:41 compute-0 systemd-logind[794]: New session 13 of user zuul.
Jan 26 19:13:41 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 26 19:13:41 compute-0 sshd-session[59553]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:13:42 compute-0 python3.9[59706]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:13:43 compute-0 python3.9[59861]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:13:45 compute-0 python3.9[60050]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:13:45 compute-0 sshd-session[59556]: Connection closed by 192.168.122.30 port 57084
Jan 26 19:13:45 compute-0 sshd-session[59553]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:13:45 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 26 19:13:45 compute-0 systemd[1]: session-13.scope: Consumed 2.614s CPU time.
Jan 26 19:13:45 compute-0 systemd-logind[794]: Session 13 logged out. Waiting for processes to exit.
Jan 26 19:13:45 compute-0 systemd-logind[794]: Removed session 13.
Jan 26 19:13:46 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 26 19:13:50 compute-0 sshd-session[60079]: Accepted publickey for zuul from 192.168.122.30 port 44392 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:13:50 compute-0 systemd-logind[794]: New session 14 of user zuul.
Jan 26 19:13:50 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 26 19:13:50 compute-0 sshd-session[60079]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:13:51 compute-0 python3.9[60232]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:13:52 compute-0 python3.9[60386]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:13:53 compute-0 sudo[60541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvcmrjpsxoczxcbqodnbztewoimiornn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454833.2056851-55-69005342737584/AnsiballZ_setup.py'
Jan 26 19:13:53 compute-0 sudo[60541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:53 compute-0 python3.9[60543]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:13:54 compute-0 sudo[60541]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:54 compute-0 sudo[60625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpevqpscxefpvrjgdmebsuflqvkyamou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454833.2056851-55-69005342737584/AnsiballZ_dnf.py'
Jan 26 19:13:54 compute-0 sudo[60625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:54 compute-0 python3.9[60627]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:13:55 compute-0 sudo[60625]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:56 compute-0 sudo[60779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqjpquvwwvpbeixrflojdovdczjnoidm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454836.067044-79-123581980321514/AnsiballZ_setup.py'
Jan 26 19:13:56 compute-0 sudo[60779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:56 compute-0 python3.9[60781]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:13:57 compute-0 sudo[60779]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:57 compute-0 sudo[60970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnbcpjrfnltifiyzxahmkruaicvmogoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454837.3527298-101-256798171336159/AnsiballZ_file.py'
Jan 26 19:13:57 compute-0 sudo[60970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:58 compute-0 python3.9[60972]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:13:58 compute-0 sudo[60970]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:58 compute-0 sudo[61122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bahxgmyktjxavgqziqujnxjzqhqmdolb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454838.2475414-117-79831190673820/AnsiballZ_command.py'
Jan 26 19:13:58 compute-0 sudo[61122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:58 compute-0 python3.9[61124]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:13:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:13:58 compute-0 sudo[61122]: pam_unix(sudo:session): session closed for user root
Jan 26 19:13:59 compute-0 sudo[61286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amlciktoshpblmsyudgyhwcsofkwtfoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454839.1870754-133-51650266330199/AnsiballZ_stat.py'
Jan 26 19:13:59 compute-0 sudo[61286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:13:59 compute-0 python3.9[61288]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:13:59 compute-0 sudo[61286]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:00 compute-0 sudo[61364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpecyulsypwotssqkziuemixemhoxfip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454839.1870754-133-51650266330199/AnsiballZ_file.py'
Jan 26 19:14:00 compute-0 sudo[61364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:00 compute-0 python3.9[61366]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:00 compute-0 sudo[61364]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:00 compute-0 sudo[61516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esnfklhqiqudkrhvkpejqntqeshljuai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454840.663405-157-235528224856485/AnsiballZ_stat.py'
Jan 26 19:14:00 compute-0 sudo[61516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:01 compute-0 python3.9[61518]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:01 compute-0 sudo[61516]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:01 compute-0 sudo[61594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpopodmlovnixfjjerngvemnqgfbvpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454840.663405-157-235528224856485/AnsiballZ_file.py'
Jan 26 19:14:01 compute-0 sudo[61594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:01 compute-0 python3.9[61596]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:01 compute-0 sudo[61594]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:02 compute-0 sudo[61746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uajgxpukrzyphbnlolctfokelrqleeow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454841.9185169-183-242744397686467/AnsiballZ_ini_file.py'
Jan 26 19:14:02 compute-0 sudo[61746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:02 compute-0 python3.9[61748]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:02 compute-0 sudo[61746]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:03 compute-0 sudo[61898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onrayfrtdxnssmbdawyvyvjodvuaepwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454842.8594925-183-195664988957576/AnsiballZ_ini_file.py'
Jan 26 19:14:03 compute-0 sudo[61898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:03 compute-0 python3.9[61900]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:03 compute-0 sudo[61898]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:03 compute-0 sudo[62050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezlodwhgsmleoefelyafthjrkvhxcdsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454843.599327-183-146623301586374/AnsiballZ_ini_file.py'
Jan 26 19:14:03 compute-0 sudo[62050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:04 compute-0 python3.9[62052]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:04 compute-0 sudo[62050]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:04 compute-0 sudo[62202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wotajbgjbivumhvckqydgbhldhomdbiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454844.2321892-183-73936034897857/AnsiballZ_ini_file.py'
Jan 26 19:14:04 compute-0 sudo[62202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:04 compute-0 python3.9[62204]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:04 compute-0 sudo[62202]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:05 compute-0 sudo[62354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mevibgfnxdwukofyceclgmmojvtzcbpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454845.0631514-245-124919458260372/AnsiballZ_dnf.py'
Jan 26 19:14:05 compute-0 sudo[62354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:05 compute-0 python3.9[62356]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:14:07 compute-0 sudo[62354]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:07 compute-0 sudo[62507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxyjupuuintqvbiumcuimdqspdcvouvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454847.5275733-267-10585496429533/AnsiballZ_setup.py'
Jan 26 19:14:07 compute-0 sudo[62507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:08 compute-0 python3.9[62509]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:14:08 compute-0 sudo[62507]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:08 compute-0 sudo[62661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsdvnbkhfqguniwnbiwnuxyugttjdsmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454848.453819-283-124571761378144/AnsiballZ_stat.py'
Jan 26 19:14:08 compute-0 sudo[62661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:09 compute-0 python3.9[62663]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:14:09 compute-0 sudo[62661]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:09 compute-0 sudo[62813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iriblfsjqefgsjnndsqyuwjpjytkxxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454849.3192272-301-138419730926648/AnsiballZ_stat.py'
Jan 26 19:14:09 compute-0 sudo[62813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:09 compute-0 python3.9[62815]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:14:09 compute-0 sudo[62813]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:10 compute-0 sudo[62965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwjzaxkuyrvntctmwzpaqpwjlyzumfqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454850.2832952-321-186080378018299/AnsiballZ_command.py'
Jan 26 19:14:10 compute-0 sudo[62965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:10 compute-0 python3.9[62967]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:14:10 compute-0 sudo[62965]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:11 compute-0 sudo[63118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krqhzycuphsjvjrzirtxpvukfikyzumq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454851.2315686-341-125638019405974/AnsiballZ_service_facts.py'
Jan 26 19:14:11 compute-0 sudo[63118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:12 compute-0 python3.9[63120]: ansible-service_facts Invoked
Jan 26 19:14:12 compute-0 network[63137]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:14:12 compute-0 network[63138]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:14:12 compute-0 network[63139]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:14:16 compute-0 sudo[63118]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:18 compute-0 sudo[63422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlgbkllueqfnsjtxwimcossdumwndvhw ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769454858.400827-371-151743057369253/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769454858.400827-371-151743057369253/args'
Jan 26 19:14:18 compute-0 sudo[63422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:18 compute-0 sudo[63422]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:19 compute-0 sudo[63589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsgndsyrsahgykqzhglvqfnyrnfkubpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454859.2009604-393-163710401564429/AnsiballZ_dnf.py'
Jan 26 19:14:19 compute-0 sudo[63589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:19 compute-0 python3.9[63591]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:14:20 compute-0 sudo[63589]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:22 compute-0 sudo[63742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmpksilrivyvywcykhykdjekarzrjeex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454861.510087-419-203158777994927/AnsiballZ_package_facts.py'
Jan 26 19:14:22 compute-0 sudo[63742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:22 compute-0 python3.9[63744]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 26 19:14:22 compute-0 sudo[63742]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:23 compute-0 sudo[63894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbdowjncxinknbnmgeaytukgxuybncbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454863.3001418-439-14522230000054/AnsiballZ_stat.py'
Jan 26 19:14:23 compute-0 sudo[63894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:23 compute-0 python3.9[63896]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:23 compute-0 sudo[63894]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:24 compute-0 sudo[64019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdrcklkexviarzzxzwbnbpdfqdoigncv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454863.3001418-439-14522230000054/AnsiballZ_copy.py'
Jan 26 19:14:24 compute-0 sudo[64019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:24 compute-0 python3.9[64021]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454863.3001418-439-14522230000054/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:24 compute-0 sudo[64019]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:25 compute-0 sudo[64173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyzinkmopnumjzehimqudqjmqfdlilwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454864.8099275-469-89484154654080/AnsiballZ_stat.py'
Jan 26 19:14:25 compute-0 sudo[64173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:25 compute-0 python3.9[64175]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:25 compute-0 sudo[64173]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:25 compute-0 sudo[64298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozrxxtdspiwiyqudfsxrriswekehtbbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454864.8099275-469-89484154654080/AnsiballZ_copy.py'
Jan 26 19:14:25 compute-0 sudo[64298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:26 compute-0 python3.9[64300]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454864.8099275-469-89484154654080/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:26 compute-0 sudo[64298]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:27 compute-0 sudo[64452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwwnpxhqpzrluusvngkqetvjbeteutyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454866.876432-511-223780483891846/AnsiballZ_lineinfile.py'
Jan 26 19:14:27 compute-0 sudo[64452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:27 compute-0 python3.9[64454]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:27 compute-0 sudo[64452]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:28 compute-0 sudo[64606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuammgmbrxetxyfhcxstrbwcyouktfil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454868.4713814-541-227613231685314/AnsiballZ_setup.py'
Jan 26 19:14:28 compute-0 sudo[64606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:29 compute-0 python3.9[64608]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:14:29 compute-0 sudo[64606]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:29 compute-0 sudo[64690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sinxazukzqotwxqmmyratokxmgeunglo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454868.4713814-541-227613231685314/AnsiballZ_systemd.py'
Jan 26 19:14:29 compute-0 sudo[64690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:30 compute-0 python3.9[64692]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:14:30 compute-0 sudo[64690]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:31 compute-0 sudo[64844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxmasokjlmglfuhatgmscqyrprxmcfed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454871.02528-573-213585767994778/AnsiballZ_setup.py'
Jan 26 19:14:31 compute-0 sudo[64844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:31 compute-0 python3.9[64846]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:14:31 compute-0 sudo[64844]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:32 compute-0 sudo[64928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyaouunfjtdzgthkjaiwglebzwgaunar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454871.02528-573-213585767994778/AnsiballZ_systemd.py'
Jan 26 19:14:32 compute-0 sudo[64928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:32 compute-0 python3.9[64930]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:14:32 compute-0 systemd[1]: Stopping NTP client/server...
Jan 26 19:14:32 compute-0 chronyd[783]: chronyd exiting
Jan 26 19:14:32 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 26 19:14:32 compute-0 systemd[1]: Stopped NTP client/server.
Jan 26 19:14:32 compute-0 systemd[1]: Starting NTP client/server...
Jan 26 19:14:32 compute-0 chronyd[64940]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 26 19:14:32 compute-0 chronyd[64940]: Frequency -26.440 +/- 0.108 ppm read from /var/lib/chrony/drift
Jan 26 19:14:32 compute-0 chronyd[64940]: Loaded seccomp filter (level 2)
Jan 26 19:14:32 compute-0 systemd[1]: Started NTP client/server.
Jan 26 19:14:32 compute-0 sudo[64928]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:32 compute-0 sshd-session[60082]: Connection closed by 192.168.122.30 port 44392
Jan 26 19:14:32 compute-0 sshd-session[60079]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:14:32 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 26 19:14:32 compute-0 systemd[1]: session-14.scope: Consumed 29.381s CPU time.
Jan 26 19:14:32 compute-0 systemd-logind[794]: Session 14 logged out. Waiting for processes to exit.
Jan 26 19:14:32 compute-0 systemd-logind[794]: Removed session 14.
Jan 26 19:14:38 compute-0 sshd-session[64967]: Accepted publickey for zuul from 192.168.122.30 port 55922 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:14:38 compute-0 systemd-logind[794]: New session 15 of user zuul.
Jan 26 19:14:38 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 26 19:14:38 compute-0 sshd-session[64967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:14:39 compute-0 python3.9[65120]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:14:40 compute-0 sudo[65274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byosaeijomzohrkwdocytpkkfhmcrhch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454880.0049489-41-275548704421587/AnsiballZ_file.py'
Jan 26 19:14:40 compute-0 sudo[65274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:40 compute-0 python3.9[65276]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:40 compute-0 sudo[65274]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:41 compute-0 sudo[65449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arfatgantxkfuivhujfhgijmbkmrancv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454881.0490336-57-12544381041020/AnsiballZ_stat.py'
Jan 26 19:14:41 compute-0 sudo[65449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:41 compute-0 python3.9[65451]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:41 compute-0 sudo[65449]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:42 compute-0 sudo[65527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqzpwyoudpjmfurzdrunoxljtlavcyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454881.0490336-57-12544381041020/AnsiballZ_file.py'
Jan 26 19:14:42 compute-0 sudo[65527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:42 compute-0 python3.9[65529]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.5gsoln0v recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:42 compute-0 sudo[65527]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:43 compute-0 sudo[65679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkaarkthztspcleewhheupiseimberlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454882.8623722-97-191741293483758/AnsiballZ_stat.py'
Jan 26 19:14:43 compute-0 sudo[65679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:43 compute-0 python3.9[65681]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:43 compute-0 sudo[65679]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:43 compute-0 sudo[65802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlmdwjqxthippscgcogoptpnwxvaekxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454882.8623722-97-191741293483758/AnsiballZ_copy.py'
Jan 26 19:14:43 compute-0 sudo[65802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:44 compute-0 python3.9[65804]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454882.8623722-97-191741293483758/.source _original_basename=.x2hs8gt3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:44 compute-0 sudo[65802]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:44 compute-0 sudo[65954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfyawuajvjwlrqesqgpdzstrrydmuxph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454884.3408585-129-216157907112900/AnsiballZ_file.py'
Jan 26 19:14:44 compute-0 sudo[65954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:44 compute-0 python3.9[65956]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:44 compute-0 sudo[65954]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:45 compute-0 sudo[66106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsplnslgcvjdmtlkbutyzdjvmbtjicrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454885.1536908-145-239366081998445/AnsiballZ_stat.py'
Jan 26 19:14:45 compute-0 sudo[66106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:45 compute-0 python3.9[66108]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:45 compute-0 sudo[66106]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:46 compute-0 sudo[66229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prmubpdrmuabmqcpxeztfialuhxlxbpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454885.1536908-145-239366081998445/AnsiballZ_copy.py'
Jan 26 19:14:46 compute-0 sudo[66229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:46 compute-0 python3.9[66231]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769454885.1536908-145-239366081998445/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:46 compute-0 sudo[66229]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:46 compute-0 sudo[66381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsljrykqmratbbdhdfqezkdzpbtkytxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454886.5653572-145-169922544757931/AnsiballZ_stat.py'
Jan 26 19:14:46 compute-0 sudo[66381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:47 compute-0 python3.9[66383]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:47 compute-0 sudo[66381]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:47 compute-0 sudo[66504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtbmoorzgodqfwnyhbezzgfpdiwdmhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454886.5653572-145-169922544757931/AnsiballZ_copy.py'
Jan 26 19:14:47 compute-0 sudo[66504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:47 compute-0 python3.9[66506]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769454886.5653572-145-169922544757931/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:14:47 compute-0 sudo[66504]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:48 compute-0 sudo[66656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdilfzdhhqfszgehragmszaziiwmznak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454888.1019084-203-233846105805764/AnsiballZ_file.py'
Jan 26 19:14:48 compute-0 sudo[66656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:48 compute-0 python3.9[66658]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:48 compute-0 sudo[66656]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:49 compute-0 sudo[66808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnbabzntzsjnvslgrbfveyokuagraddd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454888.9174943-219-183019631792560/AnsiballZ_stat.py'
Jan 26 19:14:49 compute-0 sudo[66808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:49 compute-0 python3.9[66810]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:49 compute-0 sudo[66808]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:49 compute-0 sudo[66931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzjvhzedbidebqqhfvdgyurmvdnredon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454888.9174943-219-183019631792560/AnsiballZ_copy.py'
Jan 26 19:14:49 compute-0 sudo[66931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:50 compute-0 python3.9[66933]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454888.9174943-219-183019631792560/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:50 compute-0 sudo[66931]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:50 compute-0 sudo[67083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qulkeahnttxjxkyqugwppbyyzlzjhuvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454890.3864875-249-95832064206420/AnsiballZ_stat.py'
Jan 26 19:14:50 compute-0 sudo[67083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:50 compute-0 python3.9[67085]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:50 compute-0 sudo[67083]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:51 compute-0 sudo[67206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhvweivskxebxlwkhdbulnjoldpcooo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454890.3864875-249-95832064206420/AnsiballZ_copy.py'
Jan 26 19:14:51 compute-0 sudo[67206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:51 compute-0 python3.9[67208]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454890.3864875-249-95832064206420/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:51 compute-0 sudo[67206]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:52 compute-0 sudo[67358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrtirsjvlbakuiuerxswzndiygofqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454891.8164074-279-27633404440886/AnsiballZ_systemd.py'
Jan 26 19:14:52 compute-0 sudo[67358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:52 compute-0 python3.9[67360]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:14:52 compute-0 systemd[1]: Reloading.
Jan 26 19:14:52 compute-0 systemd-rc-local-generator[67388]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:14:52 compute-0 systemd-sysv-generator[67392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:14:53 compute-0 systemd[1]: Reloading.
Jan 26 19:14:53 compute-0 systemd-rc-local-generator[67420]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:14:53 compute-0 systemd-sysv-generator[67423]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:14:53 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 26 19:14:53 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 26 19:14:53 compute-0 sudo[67358]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:53 compute-0 sudo[67586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqsgukbyttebhlrvwcylciglbdkskuci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454893.4518778-295-100941392994924/AnsiballZ_stat.py'
Jan 26 19:14:53 compute-0 sudo[67586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:53 compute-0 python3.9[67588]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:53 compute-0 sudo[67586]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:54 compute-0 sudo[67709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nibfzghsmzntuhbhzcahpfhoualoqxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454893.4518778-295-100941392994924/AnsiballZ_copy.py'
Jan 26 19:14:54 compute-0 sudo[67709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:54 compute-0 python3.9[67711]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454893.4518778-295-100941392994924/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:54 compute-0 sudo[67709]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:54 compute-0 sudo[67861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsqnkgguziigovfyrguezxxzmtsbfdof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454894.6894827-325-190005569535923/AnsiballZ_stat.py'
Jan 26 19:14:54 compute-0 sudo[67861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:55 compute-0 python3.9[67863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:14:55 compute-0 sudo[67861]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:55 compute-0 sudo[67984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pycwzfterkmxwislfilpwdxkrwizffue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454894.6894827-325-190005569535923/AnsiballZ_copy.py'
Jan 26 19:14:55 compute-0 sudo[67984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:55 compute-0 python3.9[67986]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454894.6894827-325-190005569535923/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:14:55 compute-0 sudo[67984]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:56 compute-0 sudo[68136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpjzfcphoovlqlbeonqynsijopdgrbje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454895.9874275-355-50831269640769/AnsiballZ_systemd.py'
Jan 26 19:14:56 compute-0 sudo[68136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:14:56 compute-0 python3.9[68138]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:14:56 compute-0 systemd[1]: Reloading.
Jan 26 19:14:56 compute-0 systemd-rc-local-generator[68159]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:14:56 compute-0 systemd-sysv-generator[68164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:14:56 compute-0 systemd[1]: Reloading.
Jan 26 19:14:56 compute-0 systemd-rc-local-generator[68197]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:14:56 compute-0 systemd-sysv-generator[68201]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:14:57 compute-0 systemd[1]: Starting Create netns directory...
Jan 26 19:14:57 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 19:14:57 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 19:14:57 compute-0 systemd[1]: Finished Create netns directory.
Jan 26 19:14:57 compute-0 sudo[68136]: pam_unix(sudo:session): session closed for user root
Jan 26 19:14:58 compute-0 python3.9[68364]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:14:58 compute-0 network[68381]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:14:58 compute-0 network[68382]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:14:58 compute-0 network[68383]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:15:01 compute-0 sudo[68645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdrenntiadkedtqofibhztqykrahlwxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454901.4345489-387-91011589976191/AnsiballZ_systemd.py'
Jan 26 19:15:01 compute-0 sudo[68645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:02 compute-0 python3.9[68647]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:15:02 compute-0 sshd-session[68617]: Invalid user oracle from 193.32.162.151 port 38960
Jan 26 19:15:02 compute-0 systemd[1]: Reloading.
Jan 26 19:15:02 compute-0 systemd-rc-local-generator[68673]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:15:02 compute-0 systemd-sysv-generator[68680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:15:02 compute-0 sshd-session[68617]: Connection closed by invalid user oracle 193.32.162.151 port 38960 [preauth]
Jan 26 19:15:02 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 26 19:15:02 compute-0 iptables.init[68688]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 26 19:15:02 compute-0 iptables.init[68688]: iptables: Flushing firewall rules: [  OK  ]
Jan 26 19:15:02 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 26 19:15:02 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 26 19:15:02 compute-0 sudo[68645]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:03 compute-0 sudo[68882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uynbnkkkihydibtrltflwfppaqjyusuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454902.9469635-387-123202788127415/AnsiballZ_systemd.py'
Jan 26 19:15:03 compute-0 sudo[68882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:03 compute-0 python3.9[68884]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:15:03 compute-0 sudo[68882]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:04 compute-0 sudo[69036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhqdeyrbblfkkuplrzyflferhtfhjdmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454904.0107143-419-138891015906317/AnsiballZ_systemd.py'
Jan 26 19:15:04 compute-0 sudo[69036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:04 compute-0 python3.9[69038]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:15:04 compute-0 systemd[1]: Reloading.
Jan 26 19:15:04 compute-0 systemd-rc-local-generator[69070]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:15:04 compute-0 systemd-sysv-generator[69074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:15:04 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 26 19:15:04 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 26 19:15:05 compute-0 sudo[69036]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:05 compute-0 sudo[69230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znnrebukxgjajdxzfbkrjsbwdrspfxcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454905.2245848-435-151582580054596/AnsiballZ_command.py'
Jan 26 19:15:05 compute-0 sudo[69230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:05 compute-0 python3.9[69232]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:06 compute-0 sudo[69230]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:06 compute-0 sudo[69383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyalvjxnulxberkklwcwbxzsrmypblnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454906.469526-463-144285930299815/AnsiballZ_stat.py'
Jan 26 19:15:06 compute-0 sudo[69383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:07 compute-0 python3.9[69385]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:07 compute-0 sudo[69383]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:07 compute-0 sudo[69508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhqtiuszivwcvzthoiweykftptjzgvww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454906.469526-463-144285930299815/AnsiballZ_copy.py'
Jan 26 19:15:07 compute-0 sudo[69508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:07 compute-0 python3.9[69510]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454906.469526-463-144285930299815/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:07 compute-0 sudo[69508]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:08 compute-0 sudo[69661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vszsgvsdttnexdxyyqzxwlelcumpgfwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454908.0244327-493-149280946046169/AnsiballZ_systemd.py'
Jan 26 19:15:08 compute-0 sudo[69661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:08 compute-0 python3.9[69663]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:15:08 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 26 19:15:08 compute-0 sshd[1002]: Received SIGHUP; restarting.
Jan 26 19:15:08 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 26 19:15:08 compute-0 sshd[1002]: Server listening on 0.0.0.0 port 22.
Jan 26 19:15:08 compute-0 sshd[1002]: Server listening on :: port 22.
Jan 26 19:15:08 compute-0 sudo[69661]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:09 compute-0 sudo[69817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjagtcuwghsanrcbketsnrvekpbinecq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454909.0015762-509-104527217703889/AnsiballZ_file.py'
Jan 26 19:15:09 compute-0 sudo[69817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:09 compute-0 python3.9[69819]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:09 compute-0 sudo[69817]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:10 compute-0 sudo[69969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-darodizioyhxbaimlevwcgbaaqhcnyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454909.7886868-525-140293847176822/AnsiballZ_stat.py'
Jan 26 19:15:10 compute-0 sudo[69969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:10 compute-0 python3.9[69971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:10 compute-0 sudo[69969]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:10 compute-0 sudo[70092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-damlpafywirigmooylpylihxzlvzwwow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454909.7886868-525-140293847176822/AnsiballZ_copy.py'
Jan 26 19:15:10 compute-0 sudo[70092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:10 compute-0 python3.9[70094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454909.7886868-525-140293847176822/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:10 compute-0 sudo[70092]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:11 compute-0 sudo[70244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waysnmhjcgnuxkobvlouteorrqwfcnjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454911.3417659-561-152872979559144/AnsiballZ_timezone.py'
Jan 26 19:15:11 compute-0 sudo[70244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:12 compute-0 python3.9[70246]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 26 19:15:12 compute-0 systemd[1]: Starting Time & Date Service...
Jan 26 19:15:12 compute-0 systemd[1]: Started Time & Date Service.
Jan 26 19:15:12 compute-0 sudo[70244]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:12 compute-0 sudo[70400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpryudbsuezmfjqgztjfapwtnsjewagc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454912.5678885-579-373595983531/AnsiballZ_file.py'
Jan 26 19:15:12 compute-0 sudo[70400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:13 compute-0 python3.9[70402]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:13 compute-0 sudo[70400]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:13 compute-0 sudo[70552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqbwxmobqouypkukdsnmxyebxsnkiyza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454913.3995233-595-91694954318357/AnsiballZ_stat.py'
Jan 26 19:15:13 compute-0 sudo[70552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:14 compute-0 python3.9[70554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:14 compute-0 sudo[70552]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:14 compute-0 sudo[70675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgrmjfzkgxgkpifxuonixsbqyykgjnth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454913.3995233-595-91694954318357/AnsiballZ_copy.py'
Jan 26 19:15:14 compute-0 sudo[70675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:14 compute-0 python3.9[70677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454913.3995233-595-91694954318357/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:14 compute-0 sudo[70675]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:15 compute-0 sudo[70827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtjlrvdysaoaqaawwearytnipyhzojbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454914.9080064-625-168397642109655/AnsiballZ_stat.py'
Jan 26 19:15:15 compute-0 sudo[70827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:15 compute-0 python3.9[70829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:15 compute-0 sudo[70827]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:15 compute-0 sudo[70950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhnjimbfcjkjcumkucigzjwsjrhqlxyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454914.9080064-625-168397642109655/AnsiballZ_copy.py'
Jan 26 19:15:15 compute-0 sudo[70950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:16 compute-0 python3.9[70952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769454914.9080064-625-168397642109655/.source.yaml _original_basename=.2_er7e5v follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:16 compute-0 sudo[70950]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:16 compute-0 sudo[71102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldcrlmgureksezhchkmujuhygqxbaipp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454916.3335514-655-171564178770256/AnsiballZ_stat.py'
Jan 26 19:15:16 compute-0 sudo[71102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:16 compute-0 python3.9[71104]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:16 compute-0 sudo[71102]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:17 compute-0 sudo[71225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucgjmexgruaessmiipihghwemkvhymzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454916.3335514-655-171564178770256/AnsiballZ_copy.py'
Jan 26 19:15:17 compute-0 sudo[71225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:17 compute-0 python3.9[71227]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454916.3335514-655-171564178770256/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:17 compute-0 sudo[71225]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:18 compute-0 sudo[71377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqahdzrbraokxjpkaemcmvvfyhsanrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454917.870286-685-222095702460659/AnsiballZ_command.py'
Jan 26 19:15:18 compute-0 sudo[71377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:18 compute-0 python3.9[71379]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:18 compute-0 sudo[71377]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:19 compute-0 sudo[71530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgpjxssijfvrakqtuktymfnsxpymvipy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454918.7192664-701-160145061275610/AnsiballZ_command.py'
Jan 26 19:15:19 compute-0 sudo[71530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:19 compute-0 python3.9[71532]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:19 compute-0 sudo[71530]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:19 compute-0 sudo[71683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sueobuhhopcisxtjarqpdnswezrfwehp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769454919.4359598-717-135138053240436/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 19:15:19 compute-0 sudo[71683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:20 compute-0 python3[71685]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 19:15:20 compute-0 sudo[71683]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:20 compute-0 sudo[71835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pppiskcdrnugsbxzlzupvevlirhheyqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454920.422358-733-187017933826700/AnsiballZ_stat.py'
Jan 26 19:15:20 compute-0 sudo[71835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:21 compute-0 python3.9[71837]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:21 compute-0 sudo[71835]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:21 compute-0 sudo[71958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhsqjkhavqhcuommiuxahniplofzdcky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454920.422358-733-187017933826700/AnsiballZ_copy.py'
Jan 26 19:15:21 compute-0 sudo[71958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:21 compute-0 python3.9[71960]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454920.422358-733-187017933826700/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:21 compute-0 sudo[71958]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:22 compute-0 sudo[72110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odsdxcdxgrpwgbvvbhbapnzyrdntqake ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454921.8998728-763-182113935862758/AnsiballZ_stat.py'
Jan 26 19:15:22 compute-0 sudo[72110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:22 compute-0 python3.9[72112]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:22 compute-0 sudo[72110]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:22 compute-0 sudo[72233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzwoduzhwzvhiibwjigpeszcfqkihjmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454921.8998728-763-182113935862758/AnsiballZ_copy.py'
Jan 26 19:15:22 compute-0 sudo[72233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:23 compute-0 python3.9[72235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454921.8998728-763-182113935862758/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:23 compute-0 sudo[72233]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:23 compute-0 sudo[72385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jajqhottaczpzmvomjklmiiireldihyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454923.3942015-793-15092749294519/AnsiballZ_stat.py'
Jan 26 19:15:23 compute-0 sudo[72385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:24 compute-0 python3.9[72387]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:24 compute-0 sudo[72385]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:24 compute-0 sudo[72508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebhhwxjgdrckswbkbcougwmgzhngfmsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454923.3942015-793-15092749294519/AnsiballZ_copy.py'
Jan 26 19:15:24 compute-0 sudo[72508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:24 compute-0 python3.9[72510]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454923.3942015-793-15092749294519/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:24 compute-0 sudo[72508]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:25 compute-0 sudo[72660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xphimrtwyetakpypjdzgcsqcpzjtypoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454924.9682312-823-132112986252614/AnsiballZ_stat.py'
Jan 26 19:15:25 compute-0 sudo[72660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:25 compute-0 python3.9[72662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:25 compute-0 sudo[72660]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:26 compute-0 sudo[72783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzozlprwagsotjqjxytegxegudmebsid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454924.9682312-823-132112986252614/AnsiballZ_copy.py'
Jan 26 19:15:26 compute-0 sudo[72783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:26 compute-0 python3.9[72785]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454924.9682312-823-132112986252614/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:26 compute-0 sudo[72783]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:26 compute-0 sudo[72935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxtxecewoejyaprtmfxhalmsbklynees ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454926.484266-853-116126080828718/AnsiballZ_stat.py'
Jan 26 19:15:26 compute-0 sudo[72935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:27 compute-0 python3.9[72937]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:15:27 compute-0 sudo[72935]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:27 compute-0 sudo[73058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmhvpxalassijixjytqwxdlhfzabrevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454926.484266-853-116126080828718/AnsiballZ_copy.py'
Jan 26 19:15:27 compute-0 sudo[73058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:27 compute-0 python3.9[73060]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454926.484266-853-116126080828718/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:27 compute-0 sudo[73058]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:28 compute-0 sudo[73210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbvzgcpbynhehuzmqxpkfyupyvqrcpoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454927.9915392-883-257271880274976/AnsiballZ_file.py'
Jan 26 19:15:28 compute-0 sudo[73210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:28 compute-0 python3.9[73212]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:28 compute-0 sudo[73210]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:28 compute-0 sudo[73362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prrdcwxgiuhxlhduufrrzuhypmttwbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454928.6703937-899-10227520696067/AnsiballZ_command.py'
Jan 26 19:15:28 compute-0 sudo[73362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:29 compute-0 python3.9[73364]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:29 compute-0 sudo[73362]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:30 compute-0 sudo[73521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urhhkrenqwjchhyrvppumbdvfhtgfebk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454929.5369406-915-199448481879742/AnsiballZ_blockinfile.py'
Jan 26 19:15:30 compute-0 sudo[73521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:30 compute-0 python3.9[73523]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:30 compute-0 sudo[73521]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:30 compute-0 sudo[73674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgagajvvvmtjqztbabmhrpwukeurpamu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454930.588849-933-245518619792254/AnsiballZ_file.py'
Jan 26 19:15:30 compute-0 sudo[73674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:31 compute-0 python3.9[73676]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:31 compute-0 sudo[73674]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:31 compute-0 sudo[73826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enellncyutpfjhblepuiqmqduxksxmsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454931.3756323-933-236947653653422/AnsiballZ_file.py'
Jan 26 19:15:31 compute-0 sudo[73826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:31 compute-0 python3.9[73828]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:32 compute-0 sudo[73826]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:32 compute-0 sudo[73978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elkskfgyygtdgyvbsmtgmszhvzdmoscc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454932.1986089-963-204872643869667/AnsiballZ_mount.py'
Jan 26 19:15:32 compute-0 sudo[73978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:32 compute-0 python3.9[73980]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 19:15:33 compute-0 sudo[73978]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:33 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:15:33 compute-0 sudo[74132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldlqhbghonkikfnulopqdoaufpkmccct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454933.1758332-963-132990752579089/AnsiballZ_mount.py'
Jan 26 19:15:33 compute-0 sudo[74132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:33 compute-0 python3.9[74134]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 26 19:15:33 compute-0 sudo[74132]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:34 compute-0 sshd-session[64970]: Connection closed by 192.168.122.30 port 55922
Jan 26 19:15:34 compute-0 sshd-session[64967]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:15:34 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 26 19:15:34 compute-0 systemd[1]: session-15.scope: Consumed 42.629s CPU time.
Jan 26 19:15:34 compute-0 systemd-logind[794]: Session 15 logged out. Waiting for processes to exit.
Jan 26 19:15:34 compute-0 systemd-logind[794]: Removed session 15.
Jan 26 19:15:40 compute-0 sshd-session[74160]: Accepted publickey for zuul from 192.168.122.30 port 49014 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:15:40 compute-0 systemd-logind[794]: New session 16 of user zuul.
Jan 26 19:15:40 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 26 19:15:40 compute-0 sshd-session[74160]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:15:40 compute-0 sudo[74314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyqvgvxyggrgtlkadeokypguobudcbux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454940.2776263-17-185535306448452/AnsiballZ_tempfile.py'
Jan 26 19:15:40 compute-0 sudo[74314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:40 compute-0 python3.9[74316]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 26 19:15:40 compute-0 sudo[74314]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:41 compute-0 sudo[74466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djxdsbnhsjczayayxrzllspmdbpixlno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454941.167681-41-212883975407287/AnsiballZ_stat.py'
Jan 26 19:15:41 compute-0 sudo[74466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:41 compute-0 python3.9[74468]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:15:41 compute-0 sudo[74466]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:42 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 19:15:43 compute-0 sudo[74620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gulwrgmjbdfvfnjvjxuwapmvutxiaymb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454942.560799-61-150419607578657/AnsiballZ_setup.py'
Jan 26 19:15:43 compute-0 sudo[74620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:43 compute-0 python3.9[74622]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:15:43 compute-0 sudo[74620]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:44 compute-0 sudo[74772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotmuarguebjyjbevgedenwkmpbznpva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454943.8210166-78-139433103630696/AnsiballZ_blockinfile.py'
Jan 26 19:15:44 compute-0 sudo[74772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:44 compute-0 python3.9[74774]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxyXEWf2XkjdQtUURfaE2h+oLCMjLXtibFEyrFZbrRwRnmok/IYRWEBgdfc4KNIX1eUnDUGmrXOcvlSX+aNKJvcvvpWkq05ONhWt1vEbur/dfF45EnQ26Ti3qWKWLAKbSNxE2CDfw743TyuobIrluboKIi89KKt5XIuvKzk+9+wHNbkEac4ANdrSg7fHQvbdybDd0RZaGvC+45tBWqQWE+gWj+i8yV2zWMTPNDP4dW5/VL9UNfW+qSohSXLeH4Lvvjng/RmihGj7RHdViLLpPXyCEz3TaZ54ptEA6sWQe9E5+bGQqq53s3gfynxNNzRHvjXs3ClbJ1Tmy4ufMfZhXclmXeQcZiQ5ApJrwIJXBJrbJStLYHa6S87pwHrvTHPIjNEDTd8ENR46cWSYQSNTcGfoH/uy97oYavlG5rig5aIyfn7TY3D88xA6aZroC4z3sp5bCNVfg38/xgayf4zU/MUVKFVPS0jZNrlprdcDLWYOHk/L0FIeywq8lEScJmLgM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEXkgPIi52XRmGLoBe4TTWFpCrnd8yGnG7NvWUapNQP/
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOmaGoedEztxgDOz1INEcxcZgMaAWxlj1y2E0g229ftqLUY2EKcPZfT6cgmrqkEbWrkCEztXb2A9lJUl33iD0qo=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyq3G0L93/+v1EGdYspD4kb7+WpXY93kdki2f9H0IxIs/isbCFWjzegKhuklHQP5+7kBKqpIF+oX/Pg+pz3v2YbP69EmjVP596heA3s70WLxcx/EdeUmKnhvT3viJu4p5DoaUDRHbajRMoau/KdPe0UXgoZohR0ygDAbmThBAQkVgb0+Ozh+SvherWPK4Jix6j4MBWZVfe6v2hARhF8ouXZEDvK7N7lk3e8P9tdjJ4823aH5PxZVLoGtWNFHyhxfBeNdwV1W0ywFPhQHCRD5+hIPw3z6Bn6lDQhaBhCdaD8ndZ3Z9wS5v6bDSxAUEOdw1FVnSknbWw4Nasb5PhlyKmDs1EPhBkjaUAjUjSHep5f5HfY2IF7zq1ZQPZ3Grtd436ReWaGRiBRUvCOeV/ejx7vyCFfouLk+8gl2n/vyDAaP+ECB4zPFH/gO7OQlwOA1D3ws3jx+YgDhKVEC1gSG5RoxpCqIz2DeC7VaaTKTwgvIW14XpYAa2aFf3aljSsm4k=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAXvAeDUK5ns3/yTkp2YoR0gPU0QVt4fX7a1rVi3T8E1
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDxmJrXQY4Y5iAiPZUUgqtfOhHl1z3oYZ4kQWTolvmNKYa3Pvr3tn/YjtO7qA+27A960wRXb05YGNzcOjqt3quU=
                                             create=True mode=0644 path=/tmp/ansible.ixecdg0f state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:44 compute-0 sudo[74772]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:45 compute-0 sudo[74924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfsdejaufufhzkivrznfdpyevpirjdlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454944.7371557-94-113956584384563/AnsiballZ_command.py'
Jan 26 19:15:45 compute-0 sudo[74924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:45 compute-0 python3.9[74926]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ixecdg0f' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:45 compute-0 sudo[74924]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:46 compute-0 sudo[75078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawvixxfecjqmprdaoffnjwbemsogvwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454945.6491358-110-245799312917130/AnsiballZ_file.py'
Jan 26 19:15:46 compute-0 sudo[75078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:46 compute-0 python3.9[75080]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ixecdg0f state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:46 compute-0 sudo[75078]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:46 compute-0 sshd-session[74164]: Connection closed by 192.168.122.30 port 49014
Jan 26 19:15:46 compute-0 sshd-session[74160]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:15:46 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 26 19:15:46 compute-0 systemd[1]: session-16.scope: Consumed 4.045s CPU time.
Jan 26 19:15:46 compute-0 systemd-logind[794]: Session 16 logged out. Waiting for processes to exit.
Jan 26 19:15:46 compute-0 systemd-logind[794]: Removed session 16.
Jan 26 19:15:52 compute-0 sshd-session[75105]: Accepted publickey for zuul from 192.168.122.30 port 53776 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:15:52 compute-0 systemd-logind[794]: New session 17 of user zuul.
Jan 26 19:15:52 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 26 19:15:52 compute-0 sshd-session[75105]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:15:53 compute-0 python3.9[75258]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:15:54 compute-0 sudo[75412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcqtrqlswpvduscsvccubvdydloaigyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454953.5021603-39-186674436773409/AnsiballZ_systemd.py'
Jan 26 19:15:54 compute-0 sudo[75412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:54 compute-0 python3.9[75414]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 19:15:54 compute-0 sudo[75412]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:55 compute-0 sudo[75566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsiiqjnzvoizshgimtwuheristhfwat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454954.769247-55-111800348685369/AnsiballZ_systemd.py'
Jan 26 19:15:55 compute-0 sudo[75566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:55 compute-0 python3.9[75568]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:15:55 compute-0 sudo[75566]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:56 compute-0 sudo[75719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbwzygakkdimudqynzgfwkyzrqwvxofq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454955.7034209-73-211082902346801/AnsiballZ_command.py'
Jan 26 19:15:56 compute-0 sudo[75719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:56 compute-0 python3.9[75721]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:56 compute-0 sudo[75719]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:57 compute-0 sudo[75872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcbexeysofievdambrmyqrifczmxvxrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454956.640797-89-79097862285944/AnsiballZ_stat.py'
Jan 26 19:15:57 compute-0 sudo[75872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:57 compute-0 python3.9[75874]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:15:57 compute-0 sudo[75872]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:57 compute-0 sudo[76026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szqklttrecligqppyjaaywifiqmytnzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454957.5510485-105-192976952453384/AnsiballZ_command.py'
Jan 26 19:15:57 compute-0 sudo[76026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:58 compute-0 python3.9[76028]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:15:58 compute-0 sudo[76026]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:58 compute-0 sudo[76181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdziuoozlrgmcymwcoqxwbiaeaixiqts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454958.3277268-121-163165767604109/AnsiballZ_file.py'
Jan 26 19:15:58 compute-0 sudo[76181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:15:59 compute-0 python3.9[76183]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:15:59 compute-0 sudo[76181]: pam_unix(sudo:session): session closed for user root
Jan 26 19:15:59 compute-0 sshd-session[75108]: Connection closed by 192.168.122.30 port 53776
Jan 26 19:15:59 compute-0 sshd-session[75105]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:15:59 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 26 19:15:59 compute-0 systemd[1]: session-17.scope: Consumed 5.328s CPU time.
Jan 26 19:15:59 compute-0 systemd-logind[794]: Session 17 logged out. Waiting for processes to exit.
Jan 26 19:15:59 compute-0 systemd-logind[794]: Removed session 17.
Jan 26 19:16:05 compute-0 sshd-session[76208]: Accepted publickey for zuul from 192.168.122.30 port 39668 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:16:05 compute-0 systemd-logind[794]: New session 18 of user zuul.
Jan 26 19:16:05 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 26 19:16:05 compute-0 sshd-session[76208]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:16:06 compute-0 python3.9[76361]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:16:07 compute-0 sudo[76515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzlmvksonzxpjtlmgbxfsriaegvkpwrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454967.0988133-43-270887605190754/AnsiballZ_setup.py'
Jan 26 19:16:07 compute-0 sudo[76515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:07 compute-0 python3.9[76517]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:16:08 compute-0 sudo[76515]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:08 compute-0 sudo[76599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfcrstyznzrthzitakuotyjxjjcznxaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454967.0988133-43-270887605190754/AnsiballZ_dnf.py'
Jan 26 19:16:08 compute-0 sudo[76599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:08 compute-0 python3.9[76601]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 26 19:16:09 compute-0 sudo[76599]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:10 compute-0 python3.9[76752]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:16:12 compute-0 python3.9[76903]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:16:13 compute-0 python3.9[77053]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:16:13 compute-0 python3.9[77203]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:16:14 compute-0 sshd-session[76211]: Connection closed by 192.168.122.30 port 39668
Jan 26 19:16:14 compute-0 sshd-session[76208]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:16:14 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 26 19:16:14 compute-0 systemd[1]: session-18.scope: Consumed 6.840s CPU time.
Jan 26 19:16:14 compute-0 systemd-logind[794]: Session 18 logged out. Waiting for processes to exit.
Jan 26 19:16:14 compute-0 systemd-logind[794]: Removed session 18.
Jan 26 19:16:19 compute-0 sshd-session[77228]: Accepted publickey for zuul from 192.168.122.30 port 54596 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:16:19 compute-0 systemd-logind[794]: New session 19 of user zuul.
Jan 26 19:16:19 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 26 19:16:19 compute-0 sshd-session[77228]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:16:20 compute-0 python3.9[77381]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:16:22 compute-0 sudo[77535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gazyqddnwzfbtygoxcopyjdrdikhftit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454981.733464-76-66895473951930/AnsiballZ_file.py'
Jan 26 19:16:22 compute-0 sudo[77535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:22 compute-0 python3.9[77537]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:22 compute-0 sudo[77535]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:22 compute-0 sudo[77687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qynmfdhcsftikpvcoztnjthdlfhbjngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454982.6050794-76-126702727483332/AnsiballZ_file.py'
Jan 26 19:16:22 compute-0 sudo[77687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:23 compute-0 python3.9[77689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:23 compute-0 sudo[77687]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:23 compute-0 sudo[77839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyfpkvbezkrdmhwuwldktkfpampqegpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454983.35563-108-257181724369018/AnsiballZ_stat.py'
Jan 26 19:16:23 compute-0 sudo[77839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:23 compute-0 python3.9[77841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:23 compute-0 sudo[77839]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:24 compute-0 sudo[77962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrhqhtppxibwecnlynvtfjyxuiyextlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454983.35563-108-257181724369018/AnsiballZ_copy.py'
Jan 26 19:16:24 compute-0 sudo[77962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:24 compute-0 python3.9[77964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454983.35563-108-257181724369018/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=420e9277f36dfa747534134060dfe9bf783cab39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:24 compute-0 sudo[77962]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:25 compute-0 sudo[78114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffwmdkwychvnebknjxwdrormrfbsslbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454984.856677-108-61189863527048/AnsiballZ_stat.py'
Jan 26 19:16:25 compute-0 sudo[78114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:25 compute-0 python3.9[78116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:25 compute-0 sudo[78114]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:25 compute-0 sudo[78237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftyyvbfkkngdrljfsruashdkanchkjjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454984.856677-108-61189863527048/AnsiballZ_copy.py'
Jan 26 19:16:25 compute-0 sudo[78237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:26 compute-0 python3.9[78239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454984.856677-108-61189863527048/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8b5793ac5f787c7ef460f291baf10e9299a1e767 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:26 compute-0 sudo[78237]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:26 compute-0 sudo[78389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahmhsnhwbbopvmtmrzvnqjlwmyafclgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454986.3501005-108-65998571346547/AnsiballZ_stat.py'
Jan 26 19:16:26 compute-0 sudo[78389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:26 compute-0 python3.9[78391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:26 compute-0 sudo[78389]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:27 compute-0 sudo[78512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvgrbjgkdqhadfypecihscecpgbhfcug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454986.3501005-108-65998571346547/AnsiballZ_copy.py'
Jan 26 19:16:27 compute-0 sudo[78512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:27 compute-0 python3.9[78514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454986.3501005-108-65998571346547/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=6b4045b6d42da07ebfcf1486d9a5bc007ee082aa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:27 compute-0 sudo[78512]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:28 compute-0 sudo[78664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zayovcqkmjbistlkpaemrrlprkmwzkoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454987.803095-201-281247264079170/AnsiballZ_file.py'
Jan 26 19:16:28 compute-0 sudo[78664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:28 compute-0 python3.9[78666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:28 compute-0 sudo[78664]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:28 compute-0 sudo[78816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lftxibbtttikjmxacyrhjoqklnxcrkjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454988.5293531-201-208000922169491/AnsiballZ_file.py'
Jan 26 19:16:28 compute-0 sudo[78816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:29 compute-0 python3.9[78818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:29 compute-0 sudo[78816]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:29 compute-0 sudo[78968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oboqoqmtonkhbqvvztxjotkiddivfvgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454989.2608826-236-31929799413684/AnsiballZ_stat.py'
Jan 26 19:16:29 compute-0 sudo[78968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:29 compute-0 python3.9[78970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:29 compute-0 sudo[78968]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:30 compute-0 sudo[79091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxeitpdvmozpuenkknutjbietbndccdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454989.2608826-236-31929799413684/AnsiballZ_copy.py'
Jan 26 19:16:30 compute-0 sudo[79091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:30 compute-0 python3.9[79093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454989.2608826-236-31929799413684/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=a6f49359f5b1c97f037af62cc87f7e1186bb63e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:30 compute-0 sudo[79091]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:31 compute-0 sudo[79243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hasegdoakbtdvsmtfdwvfaxgluphmkhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454990.7601106-236-252177686401970/AnsiballZ_stat.py'
Jan 26 19:16:31 compute-0 sudo[79243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:31 compute-0 python3.9[79245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:31 compute-0 sudo[79243]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:31 compute-0 sudo[79366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djovcjakatfuygxpoqflibmpwngxkexe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454990.7601106-236-252177686401970/AnsiballZ_copy.py'
Jan 26 19:16:31 compute-0 sudo[79366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:31 compute-0 python3.9[79368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454990.7601106-236-252177686401970/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ef705fe6e70e9c7514da0279f49f0326803b0dfc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:32 compute-0 sudo[79366]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:32 compute-0 sudo[79518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwbqtugmmzcpauhpzbdehprkirwfymkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454992.1693513-236-96528420181661/AnsiballZ_stat.py'
Jan 26 19:16:32 compute-0 sudo[79518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:32 compute-0 python3.9[79520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:32 compute-0 sudo[79518]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:33 compute-0 sudo[79641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbqynlrbduwmaczrfjcvuiqzqlyizoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454992.1693513-236-96528420181661/AnsiballZ_copy.py'
Jan 26 19:16:33 compute-0 sudo[79641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:33 compute-0 python3.9[79643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454992.1693513-236-96528420181661/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=656194db0073207e1c1433286e335638148dd899 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:33 compute-0 sudo[79641]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:34 compute-0 sudo[79793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijewqehawqhwutximewxcxoktjhqabnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454993.6743138-333-26902040164692/AnsiballZ_file.py'
Jan 26 19:16:34 compute-0 sudo[79793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:34 compute-0 python3.9[79795]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:34 compute-0 sudo[79793]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:34 compute-0 sudo[79945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmriyqkjcmymxjbskvbbpcrrhwekbqdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454994.4463062-333-85244312347044/AnsiballZ_file.py'
Jan 26 19:16:34 compute-0 sudo[79945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:35 compute-0 python3.9[79947]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:35 compute-0 sudo[79945]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:35 compute-0 sudo[80097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lloqvbpunbxpjmezouiyonanwrgotpdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454995.228278-362-215586964713054/AnsiballZ_stat.py'
Jan 26 19:16:35 compute-0 sudo[80097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:35 compute-0 python3.9[80099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:35 compute-0 sudo[80097]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:36 compute-0 sudo[80220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqkdcpmwvnurvnqtppetdpobmgvrfpfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454995.228278-362-215586964713054/AnsiballZ_copy.py'
Jan 26 19:16:36 compute-0 sudo[80220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:36 compute-0 python3.9[80222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454995.228278-362-215586964713054/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=36a3ce89e40318e522ce56e0053b8ddaedd032a2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:36 compute-0 sudo[80220]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:37 compute-0 sudo[80372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xswhrxcikwmynfclmirnfjbvjbyqjurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454996.8193443-362-112006125481968/AnsiballZ_stat.py'
Jan 26 19:16:37 compute-0 sudo[80372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:37 compute-0 python3.9[80374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:37 compute-0 sudo[80372]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:37 compute-0 sudo[80495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exaogeqdxpkawylwqbqadqjyhlxwadbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454996.8193443-362-112006125481968/AnsiballZ_copy.py'
Jan 26 19:16:37 compute-0 sudo[80495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:37 compute-0 python3.9[80497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454996.8193443-362-112006125481968/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ee0f2b28ff1a457664aa8d7e17fb46d64a49ffc6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:38 compute-0 sudo[80495]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:38 compute-0 sudo[80647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdabhzsjlmqkkqhkqwfxrjrndwvvrwml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454998.1311421-362-437929703668/AnsiballZ_stat.py'
Jan 26 19:16:38 compute-0 sudo[80647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:38 compute-0 python3.9[80649]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:38 compute-0 sudo[80647]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:39 compute-0 sudo[80770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdlfgrsbicvsnirrisoztxxgmwtmzcip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454998.1311421-362-437929703668/AnsiballZ_copy.py'
Jan 26 19:16:39 compute-0 sudo[80770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:39 compute-0 python3.9[80772]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769454998.1311421-362-437929703668/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=95fd778adf0f6ae054ab020f5a0559d2a0062cfe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:39 compute-0 sudo[80770]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:39 compute-0 sudo[80922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfufjbdnlbdrtsshayljaierouwgcnrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769454999.5263283-458-278499016245887/AnsiballZ_file.py'
Jan 26 19:16:39 compute-0 sudo[80922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:40 compute-0 python3.9[80924]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:40 compute-0 sudo[80922]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:40 compute-0 sudo[81074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnfqmpuqcaipluerkoueiveqzfgncold ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455000.3403053-458-123954699493705/AnsiballZ_file.py'
Jan 26 19:16:40 compute-0 sudo[81074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:40 compute-0 python3.9[81076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:40 compute-0 sudo[81074]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:41 compute-0 sudo[81226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyamjsetroyizthyhbwguxsfwwzvdpma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455001.11261-489-26231167081438/AnsiballZ_stat.py'
Jan 26 19:16:41 compute-0 sudo[81226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:41 compute-0 chronyd[64940]: Selected source 51.222.111.13 (pool.ntp.org)
Jan 26 19:16:41 compute-0 python3.9[81228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:41 compute-0 sudo[81226]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:42 compute-0 sudo[81349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqlagvivopugbqzatvjcrfmcqmpqloeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455001.11261-489-26231167081438/AnsiballZ_copy.py'
Jan 26 19:16:42 compute-0 sudo[81349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:42 compute-0 python3.9[81351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455001.11261-489-26231167081438/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=feea0c6f22e3b2cb614e4d67719a4451fb8bb71f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:42 compute-0 sudo[81349]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:42 compute-0 sudo[81501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndbrsjaxsmkztbbdzlhujcgnllyxdbkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455002.4878087-489-187856258718720/AnsiballZ_stat.py'
Jan 26 19:16:42 compute-0 sudo[81501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:43 compute-0 python3.9[81503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:43 compute-0 sudo[81501]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:43 compute-0 sudo[81624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwshiyavsveflaoghjizmjmkaswjbnrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455002.4878087-489-187856258718720/AnsiballZ_copy.py'
Jan 26 19:16:43 compute-0 sudo[81624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:43 compute-0 python3.9[81626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455002.4878087-489-187856258718720/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ee0f2b28ff1a457664aa8d7e17fb46d64a49ffc6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:43 compute-0 sudo[81624]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:44 compute-0 sudo[81776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqimdmvkwwmvwciqsgkeosilovlkjsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455003.9014592-489-138969478783809/AnsiballZ_stat.py'
Jan 26 19:16:44 compute-0 sudo[81776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:44 compute-0 python3.9[81778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:44 compute-0 sudo[81776]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:44 compute-0 sudo[81899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yijyuuhcmbagabzhildydutkkwehwtzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455003.9014592-489-138969478783809/AnsiballZ_copy.py'
Jan 26 19:16:44 compute-0 sudo[81899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:44 compute-0 python3.9[81901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455003.9014592-489-138969478783809/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1e62187bf7683f04e9d1b4ac25b9cd244d7cef32 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:45 compute-0 sudo[81899]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:46 compute-0 sudo[82051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxzyqolyuouhvzcndxhtdwbilfesdfvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455005.9022229-617-39305381186280/AnsiballZ_file.py'
Jan 26 19:16:46 compute-0 sudo[82051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:46 compute-0 python3.9[82053]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:46 compute-0 sudo[82051]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:46 compute-0 sudo[82203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvijuquqtcsgzdlblydnfkdhohoyrcgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455006.6355114-633-32805996087425/AnsiballZ_stat.py'
Jan 26 19:16:46 compute-0 sudo[82203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:47 compute-0 python3.9[82205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:47 compute-0 sudo[82203]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:47 compute-0 sudo[82326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwhcjnkaclircepnsmmhdeawqteueysc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455006.6355114-633-32805996087425/AnsiballZ_copy.py'
Jan 26 19:16:47 compute-0 sudo[82326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:47 compute-0 python3.9[82328]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455006.6355114-633-32805996087425/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:47 compute-0 sudo[82326]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:48 compute-0 sudo[82478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwiyggksiasyafwuxcbavsbjystwxnrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455008.2217104-666-144132154415966/AnsiballZ_file.py'
Jan 26 19:16:48 compute-0 sudo[82478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:48 compute-0 python3.9[82480]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:48 compute-0 sudo[82478]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:49 compute-0 sudo[82630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dugfobcemjetneckqzzjagpynimgmdhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455009.0038226-682-79803866966476/AnsiballZ_stat.py'
Jan 26 19:16:49 compute-0 sudo[82630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:49 compute-0 python3.9[82632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:49 compute-0 sudo[82630]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:50 compute-0 sudo[82753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcqkwntagsntrabqvzjsbekawuzickeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455009.0038226-682-79803866966476/AnsiballZ_copy.py'
Jan 26 19:16:50 compute-0 sudo[82753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:50 compute-0 python3.9[82755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455009.0038226-682-79803866966476/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:50 compute-0 sudo[82753]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:50 compute-0 sudo[82905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yujyawnzkpzfkuktdgyaqessezxevvwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455010.4759083-714-194303283259814/AnsiballZ_file.py'
Jan 26 19:16:50 compute-0 sudo[82905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:51 compute-0 python3.9[82907]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:51 compute-0 sudo[82905]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:51 compute-0 sudo[83057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uespbtopxdkbblinbkjbtdstgfctudui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455011.2089682-729-92657137098813/AnsiballZ_stat.py'
Jan 26 19:16:51 compute-0 sudo[83057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:51 compute-0 python3.9[83059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:51 compute-0 sudo[83057]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:52 compute-0 sudo[83180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydogushjwwcpmbjasklpohoiuculgjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455011.2089682-729-92657137098813/AnsiballZ_copy.py'
Jan 26 19:16:52 compute-0 sudo[83180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:52 compute-0 python3.9[83182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455011.2089682-729-92657137098813/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:52 compute-0 sudo[83180]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:53 compute-0 sudo[83332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evnmwemoxswcxgqylikxzokfafdketiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455012.6716928-762-30706662987415/AnsiballZ_file.py'
Jan 26 19:16:53 compute-0 sudo[83332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:53 compute-0 python3.9[83334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:53 compute-0 sudo[83332]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:53 compute-0 sudo[83484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egvdlnyecspzfasoxkkwdtgqcfflsofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455013.487483-777-182484884359885/AnsiballZ_stat.py'
Jan 26 19:16:53 compute-0 sudo[83484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:54 compute-0 python3.9[83486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:54 compute-0 sudo[83484]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:54 compute-0 sudo[83607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgchozkakezvpeabdallvvdmntkjolnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455013.487483-777-182484884359885/AnsiballZ_copy.py'
Jan 26 19:16:54 compute-0 sudo[83607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:54 compute-0 python3.9[83609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455013.487483-777-182484884359885/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:54 compute-0 sudo[83607]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:55 compute-0 sudo[83759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imyuehhnttunnduviuxwwdpwozpaiime ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455014.9958954-813-177754841265798/AnsiballZ_file.py'
Jan 26 19:16:55 compute-0 sudo[83759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:55 compute-0 python3.9[83761]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:55 compute-0 sudo[83759]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:56 compute-0 sudo[83911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfncbffgdxewlblwhcecyfsrohffjhzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455015.969519-827-219706973504379/AnsiballZ_stat.py'
Jan 26 19:16:56 compute-0 sudo[83911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:56 compute-0 python3.9[83913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:56 compute-0 sudo[83911]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:56 compute-0 sudo[84034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imdhxetbzesaqsyevcuebmoamlktyjly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455015.969519-827-219706973504379/AnsiballZ_copy.py'
Jan 26 19:16:56 compute-0 sudo[84034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:57 compute-0 python3.9[84036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455015.969519-827-219706973504379/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:57 compute-0 sudo[84034]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:57 compute-0 sudo[84186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hempufqzozdehaqpemlxiwtaxzcqqreu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455017.4041662-861-91675682075240/AnsiballZ_file.py'
Jan 26 19:16:57 compute-0 sudo[84186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:57 compute-0 python3.9[84188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:16:58 compute-0 sudo[84186]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:58 compute-0 sudo[84338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwgqipfarukrjzyxbdgqknzbvgdqaqvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455018.1725392-874-261162660258485/AnsiballZ_stat.py'
Jan 26 19:16:58 compute-0 sudo[84338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:58 compute-0 python3.9[84340]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:16:58 compute-0 sudo[84338]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:59 compute-0 sudo[84461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svlzsapzbafqiuxjdktekcerbjjklseh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455018.1725392-874-261162660258485/AnsiballZ_copy.py'
Jan 26 19:16:59 compute-0 sudo[84461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:16:59 compute-0 python3.9[84463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455018.1725392-874-261162660258485/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:16:59 compute-0 sudo[84461]: pam_unix(sudo:session): session closed for user root
Jan 26 19:16:59 compute-0 sudo[84613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxasvupwezquawfcwxrlxibilzisieuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455019.7063026-890-121634965850481/AnsiballZ_file.py'
Jan 26 19:16:59 compute-0 sudo[84613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:00 compute-0 python3.9[84615]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:00 compute-0 sudo[84613]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:00 compute-0 sudo[84765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryekdqetmbpbfbvyxkmrpoaihrvdwmog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455020.3866408-898-250087632014353/AnsiballZ_stat.py'
Jan 26 19:17:00 compute-0 sudo[84765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:00 compute-0 python3.9[84767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:00 compute-0 sudo[84765]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:01 compute-0 sudo[84888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qycmtyucgkunbytqmgkufwwuebrechtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455020.3866408-898-250087632014353/AnsiballZ_copy.py'
Jan 26 19:17:01 compute-0 sudo[84888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:01 compute-0 python3.9[84890]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455020.3866408-898-250087632014353/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f27e502916f2e446220b4e099d9f250733d415ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:01 compute-0 sudo[84888]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:02 compute-0 sshd-session[77231]: Connection closed by 192.168.122.30 port 54596
Jan 26 19:17:02 compute-0 sshd-session[77228]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:17:02 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 26 19:17:02 compute-0 systemd[1]: session-19.scope: Consumed 34.380s CPU time.
Jan 26 19:17:02 compute-0 systemd-logind[794]: Session 19 logged out. Waiting for processes to exit.
Jan 26 19:17:02 compute-0 systemd-logind[794]: Removed session 19.
Jan 26 19:17:07 compute-0 sshd-session[84915]: Accepted publickey for zuul from 192.168.122.30 port 58116 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:17:07 compute-0 systemd-logind[794]: New session 20 of user zuul.
Jan 26 19:17:07 compute-0 systemd[1]: Started Session 20 of User zuul.
Jan 26 19:17:07 compute-0 sshd-session[84915]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:17:08 compute-0 python3.9[85068]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:17:09 compute-0 sudo[85222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-objahjthyszutzidmjfnbrbwzcassofa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455028.62875-43-258191951254542/AnsiballZ_file.py'
Jan 26 19:17:09 compute-0 sudo[85222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:09 compute-0 python3.9[85224]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:09 compute-0 sudo[85222]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:09 compute-0 sudo[85374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ichuikxebnyfxakpvumhmtzibdtvrmmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455029.538678-43-98875944992017/AnsiballZ_file.py'
Jan 26 19:17:09 compute-0 sudo[85374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:10 compute-0 python3.9[85376]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:10 compute-0 sudo[85374]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:10 compute-0 python3.9[85526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:17:11 compute-0 sudo[85676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwiuzssioajygvtchrihtclwnstkfpiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455031.175479-89-35245224106272/AnsiballZ_seboolean.py'
Jan 26 19:17:11 compute-0 sudo[85676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:11 compute-0 python3.9[85678]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 19:17:13 compute-0 sudo[85676]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:13 compute-0 sudo[85832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyrsbwvhvohjpcumkrguhluafncffgob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455033.4314988-109-242456649219947/AnsiballZ_setup.py'
Jan 26 19:17:13 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 26 19:17:13 compute-0 sudo[85832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:14 compute-0 python3.9[85834]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:17:14 compute-0 sudo[85832]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:14 compute-0 sudo[85916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttnvkobxhzghltgfitcjknxjvaoawqbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455033.4314988-109-242456649219947/AnsiballZ_dnf.py'
Jan 26 19:17:14 compute-0 sudo[85916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:14 compute-0 python3.9[85918]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:17:16 compute-0 sudo[85916]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:17 compute-0 sudo[86069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hovbyrxeaeansavcnvmoxbekrrehezcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455036.5034726-133-91254999224136/AnsiballZ_systemd.py'
Jan 26 19:17:17 compute-0 sudo[86069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:17 compute-0 python3.9[86071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:17:17 compute-0 sudo[86069]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:18 compute-0 sudo[86224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eanzgtyfeqnmftlttsosahxkzkrembpf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455037.8443394-149-184374887773598/AnsiballZ_edpm_nftables_snippet.py'
Jan 26 19:17:18 compute-0 sudo[86224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:18 compute-0 python3[86226]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 26 19:17:18 compute-0 sudo[86224]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:19 compute-0 sudo[86376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgwdrnvhujpallrecgikudlpbdmdvgmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455038.9563725-167-217325727070938/AnsiballZ_file.py'
Jan 26 19:17:19 compute-0 sudo[86376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:19 compute-0 python3.9[86378]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:19 compute-0 sudo[86376]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:20 compute-0 sudo[86528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwteeaovqoyaqfuyfmzkeykyquejyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455039.8673227-183-237345186221404/AnsiballZ_stat.py'
Jan 26 19:17:20 compute-0 sudo[86528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:20 compute-0 python3.9[86530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:20 compute-0 sudo[86528]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:20 compute-0 sudo[86606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emtfmlzsybmgyiprpwcgjbgqpnwjnrfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455039.8673227-183-237345186221404/AnsiballZ_file.py'
Jan 26 19:17:20 compute-0 sudo[86606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:21 compute-0 python3.9[86608]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:21 compute-0 sudo[86606]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:21 compute-0 sudo[86758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vncqmyvbdawoqiyoyzvhctyyyrvujpmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455041.3236828-207-21548939750366/AnsiballZ_stat.py'
Jan 26 19:17:21 compute-0 sudo[86758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:21 compute-0 python3.9[86760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:21 compute-0 sudo[86758]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:22 compute-0 sudo[86836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cogkhzxkdfhdlhfhhmaskhkxmdnvcdku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455041.3236828-207-21548939750366/AnsiballZ_file.py'
Jan 26 19:17:22 compute-0 sudo[86836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:22 compute-0 python3.9[86838]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.s0vrdayj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:22 compute-0 sudo[86836]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:23 compute-0 sudo[86988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-silozlkptkkhxsiysgscuqhtdqmrbeov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455042.6597211-231-261005827589023/AnsiballZ_stat.py'
Jan 26 19:17:23 compute-0 sudo[86988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:23 compute-0 python3.9[86990]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:23 compute-0 sudo[86988]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:23 compute-0 sudo[87066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmsshfenqwgjzabywxjhnreovacqvdbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455042.6597211-231-261005827589023/AnsiballZ_file.py'
Jan 26 19:17:23 compute-0 sudo[87066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:23 compute-0 python3.9[87068]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:23 compute-0 sudo[87066]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:24 compute-0 sudo[87218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wppsbaybuyhqghgezncjzgbrcgxdtslc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455043.936572-257-160147227488556/AnsiballZ_command.py'
Jan 26 19:17:24 compute-0 sudo[87218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:24 compute-0 python3.9[87220]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:24 compute-0 sudo[87218]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:25 compute-0 sudo[87371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjkpeyrcjuavazcbamknullkekdcwcjm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455044.836974-273-60384246244997/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 19:17:25 compute-0 sudo[87371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:25 compute-0 python3[87373]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 19:17:25 compute-0 sudo[87371]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:26 compute-0 sudo[87523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxmaouctkowyazbuidchnjtsjpvfvczd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455045.7876544-289-111758030634821/AnsiballZ_stat.py'
Jan 26 19:17:26 compute-0 sudo[87523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:26 compute-0 python3.9[87525]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:26 compute-0 sudo[87523]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:27 compute-0 sudo[87648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnrxrozmdqamxhofublypplkmecqpzkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455045.7876544-289-111758030634821/AnsiballZ_copy.py'
Jan 26 19:17:27 compute-0 sudo[87648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:27 compute-0 python3.9[87650]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455045.7876544-289-111758030634821/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:27 compute-0 sudo[87648]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:28 compute-0 sudo[87800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jawimtwzobjjiqbrydxbavlowtuczmbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455048.31404-319-22766850673048/AnsiballZ_stat.py'
Jan 26 19:17:28 compute-0 sudo[87800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:28 compute-0 python3.9[87802]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:28 compute-0 sudo[87800]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:29 compute-0 sudo[87925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umarlkrouonrmcmhydfznudjhkrwyppf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455048.31404-319-22766850673048/AnsiballZ_copy.py'
Jan 26 19:17:29 compute-0 sudo[87925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:29 compute-0 python3.9[87927]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455048.31404-319-22766850673048/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:29 compute-0 sudo[87925]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:30 compute-0 sudo[88079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sncnqfksamxklsgkwwrqrbljdukjetni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455049.8605664-349-80224502600981/AnsiballZ_stat.py'
Jan 26 19:17:30 compute-0 sudo[88079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:30 compute-0 python3.9[88081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:30 compute-0 sudo[88079]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:31 compute-0 sudo[88204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntokjlnyvntwqlyveirrhuobvqtqfvpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455049.8605664-349-80224502600981/AnsiballZ_copy.py'
Jan 26 19:17:31 compute-0 sudo[88204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:31 compute-0 python3.9[88206]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455049.8605664-349-80224502600981/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:31 compute-0 sudo[88204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:31 compute-0 sshd-session[88004]: Invalid user hmsftp from 193.32.162.151 port 44544
Jan 26 19:17:31 compute-0 sudo[88356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbazcrncoahinhqvyqmaaovelvtvwot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455051.6136143-379-42800785399689/AnsiballZ_stat.py'
Jan 26 19:17:31 compute-0 sudo[88356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:31 compute-0 sshd-session[88004]: Connection closed by invalid user hmsftp 193.32.162.151 port 44544 [preauth]
Jan 26 19:17:32 compute-0 python3.9[88358]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:32 compute-0 sudo[88356]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:32 compute-0 sudo[88481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orshniftkslgjymdwjsxuhaoxmfvblfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455051.6136143-379-42800785399689/AnsiballZ_copy.py'
Jan 26 19:17:32 compute-0 sudo[88481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:32 compute-0 python3.9[88483]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455051.6136143-379-42800785399689/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:32 compute-0 sudo[88481]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:33 compute-0 sudo[88633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhwsduxwxcnsmjmjgwmggjyuwwialdfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455052.9448805-409-121453711789531/AnsiballZ_stat.py'
Jan 26 19:17:33 compute-0 sudo[88633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:33 compute-0 python3.9[88635]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:33 compute-0 sudo[88633]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:33 compute-0 sudo[88758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyxlrqbggfqpqbcvmenazqjczpptqllf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455052.9448805-409-121453711789531/AnsiballZ_copy.py'
Jan 26 19:17:33 compute-0 sudo[88758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:34 compute-0 python3.9[88760]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455052.9448805-409-121453711789531/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:34 compute-0 sudo[88758]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:34 compute-0 sudo[88910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuafofzjixbsxqgiejaqumznhrrpbxar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455054.391589-439-159638675352432/AnsiballZ_file.py'
Jan 26 19:17:34 compute-0 sudo[88910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:34 compute-0 python3.9[88912]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:34 compute-0 sudo[88910]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:35 compute-0 sudo[89062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnfdfiflmybueumqzfncmdxnnhshytkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455055.1234593-455-212465793556975/AnsiballZ_command.py'
Jan 26 19:17:35 compute-0 sudo[89062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:35 compute-0 python3.9[89064]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:35 compute-0 sudo[89062]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:36 compute-0 sudo[89217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fohglxdtnsjufghvurssshwxwvrxydmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455055.8291783-471-272754114411057/AnsiballZ_blockinfile.py'
Jan 26 19:17:36 compute-0 sudo[89217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:36 compute-0 python3.9[89219]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:36 compute-0 sudo[89217]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:37 compute-0 sudo[89369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxngfhnubacymzybfszwzzvpnvwvwgjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455056.8513937-489-12519073774049/AnsiballZ_command.py'
Jan 26 19:17:37 compute-0 sudo[89369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:37 compute-0 python3.9[89371]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:37 compute-0 sudo[89369]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:37 compute-0 sudo[89522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spzsritombghzqdqzqgbndjvvikkxpqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455057.567452-505-153843287927350/AnsiballZ_stat.py'
Jan 26 19:17:37 compute-0 sudo[89522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:38 compute-0 python3.9[89524]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:17:38 compute-0 sudo[89522]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:38 compute-0 sudo[89676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enwrhfhhwebwrwxqondquzmhnbvbazbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455058.3368182-521-233299742997310/AnsiballZ_command.py'
Jan 26 19:17:38 compute-0 sudo[89676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:38 compute-0 python3.9[89678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:38 compute-0 sudo[89676]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:39 compute-0 sudo[89831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnathalckjmpvbxhoczommawujpeioax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455059.1888013-537-204809181827345/AnsiballZ_file.py'
Jan 26 19:17:39 compute-0 sudo[89831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:39 compute-0 python3.9[89833]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:39 compute-0 sudo[89831]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:41 compute-0 python3.9[89983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:17:42 compute-0 sudo[90134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wueibupnkqcdesotqyedgylnsbzetves ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455061.8063934-617-35052203155084/AnsiballZ_command.py'
Jan 26 19:17:42 compute-0 sudo[90134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:42 compute-0 python3.9[90136]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:42 compute-0 ovs-vsctl[90137]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 26 19:17:42 compute-0 sudo[90134]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:43 compute-0 sudo[90287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tltldchncnytoopsevjclptzztaqjxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455062.6681788-635-230418075750393/AnsiballZ_command.py'
Jan 26 19:17:43 compute-0 sudo[90287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:43 compute-0 python3.9[90289]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:43 compute-0 sudo[90287]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:43 compute-0 sudo[90442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbpnobpcajpiurdayzbarjgkytgtumcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455063.4730308-651-36167527462355/AnsiballZ_command.py'
Jan 26 19:17:43 compute-0 sudo[90442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:44 compute-0 python3.9[90444]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:17:44 compute-0 ovs-vsctl[90445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 26 19:17:44 compute-0 sudo[90442]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:44 compute-0 python3.9[90595]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:17:45 compute-0 sudo[90747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swbekxmoydojmzbhicxeutbbqdqawnum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455065.154324-685-228487052205463/AnsiballZ_file.py'
Jan 26 19:17:45 compute-0 sudo[90747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:45 compute-0 python3.9[90749]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:45 compute-0 sudo[90747]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:46 compute-0 sudo[90899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozinonxzgvlzkhxpsoneulpcaecwwwci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455065.983492-701-138993417342154/AnsiballZ_stat.py'
Jan 26 19:17:46 compute-0 sudo[90899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:46 compute-0 python3.9[90901]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:46 compute-0 sudo[90899]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:47 compute-0 sudo[90977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beusvaepnrbziwhqphljbisvewqrmdxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455065.983492-701-138993417342154/AnsiballZ_file.py'
Jan 26 19:17:47 compute-0 sudo[90977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:47 compute-0 python3.9[90979]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:47 compute-0 sudo[90977]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:47 compute-0 sudo[91129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehbyfylrpkpotclctiqwssrdftrnsxxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455067.4344606-701-47318413914771/AnsiballZ_stat.py'
Jan 26 19:17:47 compute-0 sudo[91129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:47 compute-0 python3.9[91131]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:48 compute-0 sudo[91129]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:48 compute-0 sudo[91207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nokrwhvymleqdvkbycwrxoziinaewgmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455067.4344606-701-47318413914771/AnsiballZ_file.py'
Jan 26 19:17:48 compute-0 sudo[91207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:48 compute-0 python3.9[91209]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:48 compute-0 sudo[91207]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:49 compute-0 sudo[91359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgtqeamteyaufsknqjvcbynrtgztakk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455068.7909074-747-268983129013415/AnsiballZ_file.py'
Jan 26 19:17:49 compute-0 sudo[91359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:49 compute-0 python3.9[91361]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:49 compute-0 sudo[91359]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:49 compute-0 sudo[91511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvyjsbyxecbepyxplhoccxykxkvvphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455069.5819008-763-253143031537633/AnsiballZ_stat.py'
Jan 26 19:17:49 compute-0 sudo[91511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:50 compute-0 python3.9[91513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:50 compute-0 sudo[91511]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:50 compute-0 sudo[91589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkgbvoqwvktjdlwjdwyexksvbptqzphm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455069.5819008-763-253143031537633/AnsiballZ_file.py'
Jan 26 19:17:50 compute-0 sudo[91589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:50 compute-0 python3.9[91591]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:50 compute-0 sudo[91589]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:51 compute-0 sudo[91741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmvhbhxzlyzprgekwfhukxloajionvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455071.1370761-787-78796303152200/AnsiballZ_stat.py'
Jan 26 19:17:51 compute-0 sudo[91741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:51 compute-0 python3.9[91743]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:51 compute-0 sudo[91741]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:52 compute-0 sudo[91819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfaxmqrjcyjrizjoqqcuevzapplkdoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455071.1370761-787-78796303152200/AnsiballZ_file.py'
Jan 26 19:17:52 compute-0 sudo[91819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:52 compute-0 python3.9[91821]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:52 compute-0 sudo[91819]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:52 compute-0 sudo[91971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okwmgckayqfslzaqhcxmodbodhszcovq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455072.5136821-811-71587367798223/AnsiballZ_systemd.py'
Jan 26 19:17:52 compute-0 sudo[91971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:53 compute-0 python3.9[91973]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:17:53 compute-0 systemd[1]: Reloading.
Jan 26 19:17:53 compute-0 systemd-rc-local-generator[91999]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:17:53 compute-0 systemd-sysv-generator[92002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:17:53 compute-0 sudo[91971]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:54 compute-0 sudo[92160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xexwejhiexvddnlosziqysmzhsxhctym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455074.199108-827-219477569261137/AnsiballZ_stat.py'
Jan 26 19:17:54 compute-0 sudo[92160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:54 compute-0 python3.9[92162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:54 compute-0 sudo[92160]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:55 compute-0 sudo[92238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyklmgzvqoaypdnwkfipsyljwhxqiml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455074.199108-827-219477569261137/AnsiballZ_file.py'
Jan 26 19:17:55 compute-0 sudo[92238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:55 compute-0 python3.9[92240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:55 compute-0 sudo[92238]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:55 compute-0 sudo[92390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfdqmbizdbcuhielqbmdeansmmexwioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455075.5925598-851-126753144300748/AnsiballZ_stat.py'
Jan 26 19:17:55 compute-0 sudo[92390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:56 compute-0 python3.9[92392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:56 compute-0 sudo[92390]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:56 compute-0 sudo[92468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjlytcbmxhumgyznqwninmwylbvotdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455075.5925598-851-126753144300748/AnsiballZ_file.py'
Jan 26 19:17:56 compute-0 sudo[92468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:56 compute-0 python3.9[92470]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:17:56 compute-0 sudo[92468]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:57 compute-0 sudo[92620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oitjzwadeddddrjznobjpapfaawmxxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455076.82967-875-129720546871703/AnsiballZ_systemd.py'
Jan 26 19:17:57 compute-0 sudo[92620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:57 compute-0 python3.9[92622]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:17:57 compute-0 systemd[1]: Reloading.
Jan 26 19:17:57 compute-0 systemd-rc-local-generator[92648]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:17:57 compute-0 systemd-sysv-generator[92651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:17:57 compute-0 systemd[1]: Starting Create netns directory...
Jan 26 19:17:57 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 19:17:57 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 19:17:57 compute-0 systemd[1]: Finished Create netns directory.
Jan 26 19:17:57 compute-0 sudo[92620]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:58 compute-0 sudo[92813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhuywfdmfnqqfkxwmxjfbtddoaoekvgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455078.11669-895-145788312741752/AnsiballZ_file.py'
Jan 26 19:17:58 compute-0 sudo[92813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:58 compute-0 python3.9[92815]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:17:58 compute-0 sudo[92813]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:59 compute-0 sudo[92965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkzfitjrojcahmbijnoxwdrxaxmjqany ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455078.7831545-911-129839931809787/AnsiballZ_stat.py'
Jan 26 19:17:59 compute-0 sudo[92965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:17:59 compute-0 python3.9[92967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:17:59 compute-0 sudo[92965]: pam_unix(sudo:session): session closed for user root
Jan 26 19:17:59 compute-0 sudo[93088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smczsdvlxwhmthygjgxcgjmjqzlhnqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455078.7831545-911-129839931809787/AnsiballZ_copy.py'
Jan 26 19:17:59 compute-0 sudo[93088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:00 compute-0 python3.9[93090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455078.7831545-911-129839931809787/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:00 compute-0 sudo[93088]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:00 compute-0 sudo[93240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pprvyfhdisrpeuryoksjasbjxatkwiqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455080.477069-945-262082672811928/AnsiballZ_file.py'
Jan 26 19:18:00 compute-0 sudo[93240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:00 compute-0 python3.9[93242]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:00 compute-0 sudo[93240]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:01 compute-0 sudo[93392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmqbqmqmjrbehofcbzdfxiolrijdzgxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455081.1525705-961-44233597600476/AnsiballZ_file.py'
Jan 26 19:18:01 compute-0 sudo[93392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:01 compute-0 python3.9[93394]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:01 compute-0 sudo[93392]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:02 compute-0 sudo[93544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvidmkxwbkinyiaxepmwkucdpidzppdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455081.9460695-977-150548896949490/AnsiballZ_stat.py'
Jan 26 19:18:02 compute-0 sudo[93544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:02 compute-0 python3.9[93546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:02 compute-0 sudo[93544]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:02 compute-0 sudo[93667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqbnacufhbdmffqbsgtanomozquyowwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455081.9460695-977-150548896949490/AnsiballZ_copy.py'
Jan 26 19:18:02 compute-0 sudo[93667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:03 compute-0 python3.9[93669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455081.9460695-977-150548896949490/.source.json _original_basename=.667tgllr follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:03 compute-0 sudo[93667]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:03 compute-0 python3.9[93819]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:06 compute-0 sudo[94240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szqbaqyjcrjtrzbwqaalqcaamzpowcwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455085.5862138-1057-272667520606091/AnsiballZ_container_config_data.py'
Jan 26 19:18:06 compute-0 sudo[94240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:06 compute-0 python3.9[94242]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 26 19:18:06 compute-0 sudo[94240]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:07 compute-0 sudo[94392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avjscnkpewfntousvbpjiojzwmwbgkys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455086.6560814-1079-16844998812125/AnsiballZ_container_config_hash.py'
Jan 26 19:18:07 compute-0 sudo[94392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:07 compute-0 python3.9[94394]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:18:07 compute-0 sudo[94392]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:08 compute-0 sudo[94544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slcsmjsqfeuomakopplksgomljlrhmwd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455087.7559729-1099-138028150015533/AnsiballZ_edpm_container_manage.py'
Jan 26 19:18:08 compute-0 sudo[94544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:08 compute-0 python3[94546]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:18:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:18:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:18:08 compute-0 podman[94582]: 2026-01-26 19:18:08.867853997 +0000 UTC m=+0.057316377 container create 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 19:18:08 compute-0 podman[94582]: 2026-01-26 19:18:08.842546109 +0000 UTC m=+0.032008499 image pull 241d2c1ab738336a495a3844d8edb58bb1ca6339db3c90d7e6fb4b3656492432 38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 19:18:08 compute-0 python3[94546]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 26 19:18:09 compute-0 sudo[94544]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 26 19:18:10 compute-0 sudo[94770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nupdxesyammhysodephphnrbgkjhgixk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455090.558096-1115-195906275812053/AnsiballZ_stat.py'
Jan 26 19:18:10 compute-0 sudo[94770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:11 compute-0 python3.9[94772]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:18:11 compute-0 sudo[94770]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:11 compute-0 sudo[94924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahjuexenhrlnhglwpgiczvvvlgwhwimr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455091.4624426-1133-199651085041736/AnsiballZ_file.py'
Jan 26 19:18:11 compute-0 sudo[94924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:11 compute-0 python3.9[94926]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:12 compute-0 sudo[94924]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:12 compute-0 sudo[95000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycxxgeumuesuojvszhhypccojdasnsvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455091.4624426-1133-199651085041736/AnsiballZ_stat.py'
Jan 26 19:18:12 compute-0 sudo[95000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:12 compute-0 python3.9[95002]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:18:12 compute-0 sudo[95000]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:13 compute-0 sudo[95151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcscpjfwmuyocmgholkrgztifupqalbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455092.556067-1133-115546875420236/AnsiballZ_copy.py'
Jan 26 19:18:13 compute-0 sudo[95151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:13 compute-0 python3.9[95153]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769455092.556067-1133-115546875420236/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:13 compute-0 sudo[95151]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:13 compute-0 sudo[95227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reoxibmulzhqnahaotcwbmytyyxvmbqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455092.556067-1133-115546875420236/AnsiballZ_systemd.py'
Jan 26 19:18:13 compute-0 sudo[95227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:13 compute-0 python3.9[95229]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:18:13 compute-0 systemd[1]: Reloading.
Jan 26 19:18:14 compute-0 systemd-rc-local-generator[95252]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:18:14 compute-0 systemd-sysv-generator[95256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:18:14 compute-0 sudo[95227]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:14 compute-0 sudo[95338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjznyitzwsmlycufcjupnaxhteowqhdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455092.556067-1133-115546875420236/AnsiballZ_systemd.py'
Jan 26 19:18:14 compute-0 sudo[95338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:14 compute-0 python3.9[95340]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:18:14 compute-0 systemd[1]: Reloading.
Jan 26 19:18:14 compute-0 systemd-sysv-generator[95376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:18:14 compute-0 systemd-rc-local-generator[95372]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:18:15 compute-0 systemd[1]: Starting ovn_controller container...
Jan 26 19:18:15 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 26 19:18:15 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:18:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bcf0ea599600f5cfe7d8a8373d5161ae2930f638cc6b04969053cc7f18b1fb5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 19:18:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172.
Jan 26 19:18:15 compute-0 podman[95381]: 2026-01-26 19:18:15.350711333 +0000 UTC m=+0.175279446 container init 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + sudo -E kolla_set_configs
Jan 26 19:18:15 compute-0 podman[95381]: 2026-01-26 19:18:15.377762995 +0000 UTC m=+0.202331078 container start 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 19:18:15 compute-0 edpm-start-podman-container[95381]: ovn_controller
Jan 26 19:18:15 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 26 19:18:15 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 26 19:18:15 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 26 19:18:15 compute-0 edpm-start-podman-container[95380]: Creating additional drop-in dependency for "ovn_controller" (790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172)
Jan 26 19:18:15 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 26 19:18:15 compute-0 systemd[95428]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 26 19:18:15 compute-0 systemd[1]: Reloading.
Jan 26 19:18:15 compute-0 podman[95402]: 2026-01-26 19:18:15.496092981 +0000 UTC m=+0.096446045 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 19:18:15 compute-0 systemd-rc-local-generator[95474]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:18:15 compute-0 systemd-sysv-generator[95477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:18:15 compute-0 systemd[95428]: Queued start job for default target Main User Target.
Jan 26 19:18:15 compute-0 systemd[95428]: Created slice User Application Slice.
Jan 26 19:18:15 compute-0 systemd[95428]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 26 19:18:15 compute-0 systemd[95428]: Started Daily Cleanup of User's Temporary Directories.
Jan 26 19:18:15 compute-0 systemd[95428]: Reached target Paths.
Jan 26 19:18:15 compute-0 systemd[95428]: Reached target Timers.
Jan 26 19:18:15 compute-0 systemd[95428]: Starting D-Bus User Message Bus Socket...
Jan 26 19:18:15 compute-0 systemd[95428]: Starting Create User's Volatile Files and Directories...
Jan 26 19:18:15 compute-0 systemd[95428]: Listening on D-Bus User Message Bus Socket.
Jan 26 19:18:15 compute-0 systemd[95428]: Reached target Sockets.
Jan 26 19:18:15 compute-0 systemd[95428]: Finished Create User's Volatile Files and Directories.
Jan 26 19:18:15 compute-0 systemd[95428]: Reached target Basic System.
Jan 26 19:18:15 compute-0 systemd[95428]: Reached target Main User Target.
Jan 26 19:18:15 compute-0 systemd[95428]: Startup finished in 175ms.
Jan 26 19:18:15 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 26 19:18:15 compute-0 systemd[1]: Started ovn_controller container.
Jan 26 19:18:15 compute-0 systemd[1]: 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172-7f59720445740562.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 19:18:15 compute-0 systemd[1]: 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172-7f59720445740562.service: Failed with result 'exit-code'.
Jan 26 19:18:15 compute-0 systemd[1]: Started Session c1 of User root.
Jan 26 19:18:15 compute-0 sudo[95338]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:15 compute-0 ovn_controller[95396]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 19:18:15 compute-0 ovn_controller[95396]: INFO:__main__:Validating config file
Jan 26 19:18:15 compute-0 ovn_controller[95396]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 19:18:15 compute-0 ovn_controller[95396]: INFO:__main__:Writing out command to execute
Jan 26 19:18:15 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 26 19:18:15 compute-0 ovn_controller[95396]: ++ cat /run_command
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + ARGS=
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + sudo kolla_copy_cacerts
Jan 26 19:18:15 compute-0 systemd[1]: Started Session c2 of User root.
Jan 26 19:18:15 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + [[ ! -n '' ]]
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + . kolla_extend_start
Jan 26 19:18:15 compute-0 ovn_controller[95396]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + umask 0022
Jan 26 19:18:15 compute-0 ovn_controller[95396]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 26 19:18:15 compute-0 ovn_controller[95396]: 2026-01-26T19:18:15Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <info>  [1769455095.9313] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <info>  [1769455095.9320] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <warn>  [1769455095.9323] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <info>  [1769455095.9329] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <info>  [1769455095.9333] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 26 19:18:15 compute-0 NetworkManager[55489]: <info>  [1769455095.9335] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 26 19:18:15 compute-0 kernel: br-int: entered promiscuous mode
Jan 26 19:18:15 compute-0 systemd-udevd[95524]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00025|main|INFO|OVS feature set changed, force recompute.
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00034|features|INFO|OVS Feature: group_support, state: supported
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00035|main|INFO|OVS feature set changed, force recompute.
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 26 19:18:16 compute-0 ovn_controller[95396]: 2026-01-26T19:18:16Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 26 19:18:16 compute-0 NetworkManager[55489]: <info>  [1769455096.9686] manager: (ovn-8c0c8e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 26 19:18:16 compute-0 NetworkManager[55489]: <info>  [1769455096.9966] device (genev_sys_6081): carrier: link connected
Jan 26 19:18:16 compute-0 NetworkManager[55489]: <info>  [1769455096.9969] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 26 19:18:16 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 26 19:18:16 compute-0 systemd-udevd[95526]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:18:19 compute-0 python3.9[95656]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 19:18:19 compute-0 NetworkManager[55489]: <info>  [1769455099.3851] manager: (ovn-c86a09-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 26 19:18:20 compute-0 sudo[95807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcdrrekeuicnvyhwjrepvsdfashhemom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455099.7950563-1223-192607373013963/AnsiballZ_stat.py'
Jan 26 19:18:20 compute-0 sudo[95807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:20 compute-0 python3.9[95809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:20 compute-0 sudo[95807]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:20 compute-0 sudo[95930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgvxeaszzbitluvfcwdvnfoxrqqxrfub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455099.7950563-1223-192607373013963/AnsiballZ_copy.py'
Jan 26 19:18:20 compute-0 sudo[95930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:21 compute-0 python3.9[95932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455099.7950563-1223-192607373013963/.source.yaml _original_basename=.ylolui73 follow=False checksum=d06791c81be0e4d2e110567057161f0a82bdf399 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:21 compute-0 sudo[95930]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:21 compute-0 sudo[96082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiacqdgvqpkrsomjbvfopmyiaoxwciqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455101.2155082-1253-159410601689864/AnsiballZ_command.py'
Jan 26 19:18:21 compute-0 sudo[96082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:21 compute-0 python3.9[96084]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:18:21 compute-0 ovs-vsctl[96085]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 26 19:18:21 compute-0 sudo[96082]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:22 compute-0 sudo[96235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edduyzmnxzvuislhxxgzpxlqttqperwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455101.9803052-1269-178966627963716/AnsiballZ_command.py'
Jan 26 19:18:22 compute-0 sudo[96235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:22 compute-0 python3.9[96237]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:18:22 compute-0 ovs-vsctl[96239]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 26 19:18:22 compute-0 sudo[96235]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:23 compute-0 sudo[96390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpnbwwelcrugclqzuvujhpkrpnbzozmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455103.0777457-1297-60551849586271/AnsiballZ_command.py'
Jan 26 19:18:23 compute-0 sudo[96390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:23 compute-0 python3.9[96392]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:18:23 compute-0 ovs-vsctl[96393]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 26 19:18:23 compute-0 sudo[96390]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:24 compute-0 sshd-session[84918]: Connection closed by 192.168.122.30 port 58116
Jan 26 19:18:24 compute-0 sshd-session[84915]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:18:24 compute-0 systemd-logind[794]: Session 20 logged out. Waiting for processes to exit.
Jan 26 19:18:24 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Jan 26 19:18:24 compute-0 systemd[1]: session-20.scope: Consumed 54.973s CPU time.
Jan 26 19:18:24 compute-0 systemd-logind[794]: Removed session 20.
Jan 26 19:18:25 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 26 19:18:25 compute-0 systemd[95428]: Activating special unit Exit the Session...
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped target Main User Target.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped target Basic System.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped target Paths.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped target Sockets.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped target Timers.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 26 19:18:25 compute-0 systemd[95428]: Closed D-Bus User Message Bus Socket.
Jan 26 19:18:25 compute-0 systemd[95428]: Stopped Create User's Volatile Files and Directories.
Jan 26 19:18:25 compute-0 systemd[95428]: Removed slice User Application Slice.
Jan 26 19:18:25 compute-0 systemd[95428]: Reached target Shutdown.
Jan 26 19:18:25 compute-0 systemd[95428]: Finished Exit the Session.
Jan 26 19:18:25 compute-0 systemd[95428]: Reached target Exit the Session.
Jan 26 19:18:25 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 26 19:18:25 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 26 19:18:26 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 26 19:18:26 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 26 19:18:26 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 26 19:18:26 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 26 19:18:26 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 26 19:18:27 compute-0 ovn_controller[95396]: 2026-01-26T19:18:27Z|00038|memory|INFO|15744 kB peak resident set size after 11.2 seconds
Jan 26 19:18:27 compute-0 ovn_controller[95396]: 2026-01-26T19:18:27Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 26 19:18:29 compute-0 sshd-session[96421]: Accepted publickey for zuul from 192.168.122.30 port 33586 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:18:29 compute-0 systemd-logind[794]: New session 22 of user zuul.
Jan 26 19:18:29 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 26 19:18:29 compute-0 sshd-session[96421]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:18:31 compute-0 python3.9[96574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:18:32 compute-0 sudo[96728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvssjvgfoewqhrkqirqscjqgqwcujmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455111.767251-43-137927252770621/AnsiballZ_file.py'
Jan 26 19:18:32 compute-0 sudo[96728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:32 compute-0 python3.9[96730]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:32 compute-0 sudo[96728]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:33 compute-0 sudo[96880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ougvctdcpcfftklpzgpialhkqctmpkrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455112.783319-43-204110293257566/AnsiballZ_file.py'
Jan 26 19:18:33 compute-0 sudo[96880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:33 compute-0 python3.9[96882]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:33 compute-0 sudo[96880]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:33 compute-0 sudo[97032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgvvitgqhickomrjseylezirufxyruhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455113.4951684-43-37887751509512/AnsiballZ_file.py'
Jan 26 19:18:33 compute-0 sudo[97032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:33 compute-0 python3.9[97034]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:33 compute-0 sudo[97032]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:34 compute-0 sudo[97184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjnnpzfraybufazaidoqjulnajjafwpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455114.1596355-43-85679928540336/AnsiballZ_file.py'
Jan 26 19:18:34 compute-0 sudo[97184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:34 compute-0 python3.9[97186]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:34 compute-0 sudo[97184]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:35 compute-0 sudo[97337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdsnvmpjvbgsaehnkeqtszrdbfatvczc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455114.9883654-43-150485919873400/AnsiballZ_file.py'
Jan 26 19:18:35 compute-0 sudo[97337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:35 compute-0 python3.9[97339]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:35 compute-0 sudo[97337]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:36 compute-0 python3.9[97489]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:18:37 compute-0 sudo[97639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jayrarzgpsltqkswrbipwyokaoubkhbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455116.6047802-131-247174925155808/AnsiballZ_seboolean.py'
Jan 26 19:18:37 compute-0 sudo[97639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:37 compute-0 python3.9[97641]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 26 19:18:37 compute-0 sudo[97639]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:38 compute-0 python3.9[97791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:39 compute-0 python3.9[97912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455118.1476579-147-188498365022740/.source follow=False _original_basename=haproxy.j2 checksum=463b2a8c5bc8d04192948e7dde86f83a5adcf7ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:40 compute-0 python3.9[98062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:40 compute-0 python3.9[98183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455119.6854753-177-68209578277865/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:41 compute-0 sudo[98333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxbdverpqbiyksaxghdiglhjoyyzilhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455121.2652988-211-236729115390559/AnsiballZ_setup.py'
Jan 26 19:18:41 compute-0 sudo[98333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:41 compute-0 python3.9[98335]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:18:42 compute-0 sudo[98333]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:42 compute-0 sudo[98417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkppnsbinjcjmkgsgamampihayjplhdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455121.2652988-211-236729115390559/AnsiballZ_dnf.py'
Jan 26 19:18:42 compute-0 sudo[98417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:42 compute-0 python3.9[98419]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:18:44 compute-0 sudo[98417]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:45 compute-0 sudo[98570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xippvlfoxkbbcizqlxtpavbktakjsrdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455124.8302927-235-203918429898521/AnsiballZ_systemd.py'
Jan 26 19:18:45 compute-0 sudo[98570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:45 compute-0 python3.9[98572]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:18:45 compute-0 sudo[98570]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:46 compute-0 podman[98574]: 2026-01-26 19:18:46.030713574 +0000 UTC m=+0.186843006 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:18:46 compute-0 python3.9[98753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:47 compute-0 python3.9[98874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455126.150542-251-220928050824869/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:47 compute-0 python3.9[99024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:48 compute-0 python3.9[99145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455127.341927-251-83487948596096/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:49 compute-0 python3.9[99295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:50 compute-0 python3.9[99416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455129.2267919-339-103686421384642/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:51 compute-0 python3.9[99566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:51 compute-0 python3.9[99687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455130.4743915-339-11119031603500/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:52 compute-0 python3.9[99837]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:18:53 compute-0 sudo[99989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruvvunkzdurstbvghfgppnqugmcajfmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455132.6832652-415-248203178082951/AnsiballZ_file.py'
Jan 26 19:18:53 compute-0 sudo[99989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:53 compute-0 python3.9[99991]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:53 compute-0 sudo[99989]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:53 compute-0 sudo[100141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvyowyalfzcwhvnvgnohnsozbhpxafd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455133.5163658-431-18205236779089/AnsiballZ_stat.py'
Jan 26 19:18:53 compute-0 sudo[100141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:54 compute-0 python3.9[100143]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:54 compute-0 sudo[100141]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:54 compute-0 sudo[100219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlgaqalgiqfwimnchyupfffisuzcescg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455133.5163658-431-18205236779089/AnsiballZ_file.py'
Jan 26 19:18:54 compute-0 sudo[100219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:54 compute-0 python3.9[100221]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:54 compute-0 sudo[100219]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:55 compute-0 sudo[100371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nletuzvuxigqjfrafqsjyusdkuzyxwbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455134.9540377-431-11013628452393/AnsiballZ_stat.py'
Jan 26 19:18:55 compute-0 sudo[100371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:55 compute-0 python3.9[100373]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:55 compute-0 sudo[100371]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:55 compute-0 sudo[100449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prxnyxnapiuegcscrupkisrbijnsyies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455134.9540377-431-11013628452393/AnsiballZ_file.py'
Jan 26 19:18:55 compute-0 sudo[100449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:56 compute-0 python3.9[100451]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:18:56 compute-0 sudo[100449]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:56 compute-0 sudo[100601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmqsxhwwqpyminmtmghtynrxreghnhwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455136.3371143-477-221329762379159/AnsiballZ_file.py'
Jan 26 19:18:56 compute-0 sudo[100601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:56 compute-0 python3.9[100603]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:56 compute-0 sudo[100601]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:57 compute-0 sudo[100753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egasvocwphrvwosdaqokiflxxvtbzhws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455137.0816095-493-108907282699065/AnsiballZ_stat.py'
Jan 26 19:18:57 compute-0 sudo[100753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:57 compute-0 python3.9[100755]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:57 compute-0 sudo[100753]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:57 compute-0 sudo[100831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdkwzleawgqnvurvtotihxqpdzabtgjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455137.0816095-493-108907282699065/AnsiballZ_file.py'
Jan 26 19:18:57 compute-0 sudo[100831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:58 compute-0 python3.9[100833]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:58 compute-0 sudo[100831]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:58 compute-0 sudo[100983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbdshvlheqjzwqusiyszxwyafazrnfpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455138.3449497-517-105978436987071/AnsiballZ_stat.py'
Jan 26 19:18:58 compute-0 sudo[100983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:58 compute-0 python3.9[100985]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:18:58 compute-0 sudo[100983]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:59 compute-0 sudo[101061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqbqpycxqiwsmqtvurlhxkttbwesieet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455138.3449497-517-105978436987071/AnsiballZ_file.py'
Jan 26 19:18:59 compute-0 sudo[101061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:18:59 compute-0 python3.9[101063]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:18:59 compute-0 sudo[101061]: pam_unix(sudo:session): session closed for user root
Jan 26 19:18:59 compute-0 sudo[101213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzalkzbxxmrybgfdhypaqsqqyrunemlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455139.622508-541-230730059178383/AnsiballZ_systemd.py'
Jan 26 19:18:59 compute-0 sudo[101213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:00 compute-0 python3.9[101215]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:00 compute-0 systemd[1]: Reloading.
Jan 26 19:19:00 compute-0 systemd-sysv-generator[101242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:00 compute-0 systemd-rc-local-generator[101239]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:00 compute-0 sudo[101213]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:01 compute-0 sudo[101402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgdxdpvzwngwfddjhaismmoeykhawooo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455140.8084385-557-84182046588232/AnsiballZ_stat.py'
Jan 26 19:19:01 compute-0 sudo[101402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:01 compute-0 python3.9[101404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:19:01 compute-0 sudo[101402]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:01 compute-0 sudo[101480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhrwonpnwcgjobeexzzsywicsdhmmff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455140.8084385-557-84182046588232/AnsiballZ_file.py'
Jan 26 19:19:01 compute-0 sudo[101480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:01 compute-0 python3.9[101482]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:01 compute-0 sudo[101480]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:02 compute-0 sudo[101632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtmtofbmfweertiggtdtvzmschbsuaal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455141.9990127-581-77621624339566/AnsiballZ_stat.py'
Jan 26 19:19:02 compute-0 sudo[101632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:02 compute-0 python3.9[101634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:19:02 compute-0 sudo[101632]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:02 compute-0 sudo[101710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqeneetjtdjsavaywuzwclcloqzfhdfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455141.9990127-581-77621624339566/AnsiballZ_file.py'
Jan 26 19:19:02 compute-0 sudo[101710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:03 compute-0 python3.9[101712]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:03 compute-0 sudo[101710]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:03 compute-0 sudo[101862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffcoynlvonxvipbxrtxbocwoiysgjoqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455143.295522-605-11247041827972/AnsiballZ_systemd.py'
Jan 26 19:19:03 compute-0 sudo[101862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:03 compute-0 python3.9[101864]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:04 compute-0 systemd[1]: Reloading.
Jan 26 19:19:04 compute-0 systemd-rc-local-generator[101888]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:04 compute-0 systemd-sysv-generator[101894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:04 compute-0 systemd[1]: Starting Create netns directory...
Jan 26 19:19:04 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 26 19:19:04 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 26 19:19:04 compute-0 systemd[1]: Finished Create netns directory.
Jan 26 19:19:04 compute-0 sudo[101862]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:05 compute-0 sudo[102055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqywdyfyivfmengmrocgebobqoxyuruu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455144.6919024-625-26960220443990/AnsiballZ_file.py'
Jan 26 19:19:05 compute-0 sudo[102055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:05 compute-0 python3.9[102057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:19:05 compute-0 sudo[102055]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:05 compute-0 sudo[102207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxpfxcnhynhsshidgzcydxmvwiaslwls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455145.4524398-641-177264319665416/AnsiballZ_stat.py'
Jan 26 19:19:05 compute-0 sudo[102207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:06 compute-0 python3.9[102209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:19:06 compute-0 sudo[102207]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:06 compute-0 sudo[102330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuqupuiaayamuqmdnjxgtvupobrmiwvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455145.4524398-641-177264319665416/AnsiballZ_copy.py'
Jan 26 19:19:06 compute-0 sudo[102330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:06 compute-0 python3.9[102332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455145.4524398-641-177264319665416/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:19:06 compute-0 sudo[102330]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:07 compute-0 sudo[102482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loqzwqytahtfdqwglqchyvtgtmbrrqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455147.0708654-675-264067657383022/AnsiballZ_file.py'
Jan 26 19:19:07 compute-0 sudo[102482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:07 compute-0 python3.9[102484]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:07 compute-0 sudo[102482]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:08 compute-0 sudo[102634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndhouaprhherrpjkerdrrlbwizybytgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455147.9042459-691-169938625557641/AnsiballZ_file.py'
Jan 26 19:19:08 compute-0 sudo[102634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:08 compute-0 python3.9[102636]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:19:08 compute-0 sudo[102634]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:09 compute-0 sudo[102786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbjzgrnkkzoeojjsklzeqeomwhflhno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455148.8338645-707-226878804568546/AnsiballZ_stat.py'
Jan 26 19:19:09 compute-0 sudo[102786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:09 compute-0 python3.9[102788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:19:09 compute-0 sudo[102786]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:09 compute-0 sudo[102909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqnlipxuypbvqbagdtoaiowjjgsjnykp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455148.8338645-707-226878804568546/AnsiballZ_copy.py'
Jan 26 19:19:09 compute-0 sudo[102909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:10 compute-0 python3.9[102911]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455148.8338645-707-226878804568546/.source.json _original_basename=._1r98db_ follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:10 compute-0 sudo[102909]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:10 compute-0 python3.9[103061]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:13 compute-0 sudo[103482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqsqsbjylafjeukqsglwuumkzwxmsqth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455152.6719596-787-190791022340028/AnsiballZ_container_config_data.py'
Jan 26 19:19:13 compute-0 sudo[103482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:13 compute-0 python3.9[103484]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 26 19:19:13 compute-0 sudo[103482]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:14 compute-0 sudo[103634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somdsbiqpcvupmbykaejhljngbxqazzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455153.786659-809-49035499011688/AnsiballZ_container_config_hash.py'
Jan 26 19:19:14 compute-0 sudo[103634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:14 compute-0 python3.9[103636]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:19:14 compute-0 sudo[103634]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:15 compute-0 sudo[103786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgcqjhuooeohoekcrlxexiwxzfraaecs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455154.861459-829-18930259848172/AnsiballZ_edpm_container_manage.py'
Jan 26 19:19:15 compute-0 sudo[103786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:15 compute-0 python3[103788]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:19:15 compute-0 podman[103826]: 2026-01-26 19:19:15.97564209 +0000 UTC m=+0.084620784 container create c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 19:19:15 compute-0 podman[103826]: 2026-01-26 19:19:15.933583321 +0000 UTC m=+0.042562055 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:19:15 compute-0 python3[103788]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:19:16 compute-0 sudo[103786]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:16 compute-0 podman[103866]: 2026-01-26 19:19:16.374650987 +0000 UTC m=+0.129681894 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 26 19:19:17 compute-0 sudo[104041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akvbbduqajbocoqkltnvslpkrklrjqfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455157.056781-845-209278002284172/AnsiballZ_stat.py'
Jan 26 19:19:17 compute-0 sudo[104041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:17 compute-0 python3.9[104043]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:19:17 compute-0 sudo[104041]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:18 compute-0 sudo[104195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osvtzjfwdbudshzbnwtitdbnvrghjyxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455157.8774323-863-1097976166086/AnsiballZ_file.py'
Jan 26 19:19:18 compute-0 sudo[104195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:18 compute-0 python3.9[104197]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:18 compute-0 sudo[104195]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:18 compute-0 sudo[104271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pipbuijwfdmohgmjhvroaqueaiyiorgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455157.8774323-863-1097976166086/AnsiballZ_stat.py'
Jan 26 19:19:18 compute-0 sudo[104271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:18 compute-0 python3.9[104273]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:19:18 compute-0 sudo[104271]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:19 compute-0 sudo[104422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbpzadrbbllszywogmvawwnhpsnmdupw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455158.9736335-863-188239781102172/AnsiballZ_copy.py'
Jan 26 19:19:19 compute-0 sudo[104422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:19 compute-0 python3.9[104424]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769455158.9736335-863-188239781102172/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:19 compute-0 sudo[104422]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:20 compute-0 sudo[104498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrerwxnzqppjvsdkyakficvfrxvcques ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455158.9736335-863-188239781102172/AnsiballZ_systemd.py'
Jan 26 19:19:20 compute-0 sudo[104498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:20 compute-0 python3.9[104500]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:19:20 compute-0 systemd[1]: Reloading.
Jan 26 19:19:20 compute-0 systemd-rc-local-generator[104527]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:20 compute-0 systemd-sysv-generator[104530]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:20 compute-0 sudo[104498]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:20 compute-0 sudo[104609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rctrgklibdxlwkyzapvumnwvmqjypzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455158.9736335-863-188239781102172/AnsiballZ_systemd.py'
Jan 26 19:19:21 compute-0 sudo[104609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:21 compute-0 python3.9[104611]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:21 compute-0 systemd[1]: Reloading.
Jan 26 19:19:21 compute-0 systemd-sysv-generator[104644]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:21 compute-0 systemd-rc-local-generator[104639]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:21 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 26 19:19:21 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:19:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82fc293a9c9e8fee5eccc759caed9f9b171f2b3a2bcd833861c05a462ebeae81/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 26 19:19:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82fc293a9c9e8fee5eccc759caed9f9b171f2b3a2bcd833861c05a462ebeae81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:19:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6.
Jan 26 19:19:21 compute-0 podman[104652]: 2026-01-26 19:19:21.801698456 +0000 UTC m=+0.167685513 container init c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + sudo -E kolla_set_configs
Jan 26 19:19:21 compute-0 podman[104652]: 2026-01-26 19:19:21.839458889 +0000 UTC m=+0.205445916 container start c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 19:19:21 compute-0 edpm-start-podman-container[104652]: ovn_metadata_agent
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Validating config file
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Copying service configuration files
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Writing out command to execute
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: ++ cat /run_command
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + CMD=neutron-ovn-metadata-agent
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + ARGS=
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + sudo kolla_copy_cacerts
Jan 26 19:19:21 compute-0 podman[104674]: 2026-01-26 19:19:21.91998698 +0000 UTC m=+0.068473186 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 19:19:21 compute-0 edpm-start-podman-container[104651]: Creating additional drop-in dependency for "ovn_metadata_agent" (c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6)
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + [[ ! -n '' ]]
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + . kolla_extend_start
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: Running command: 'neutron-ovn-metadata-agent'
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + umask 0022
Jan 26 19:19:21 compute-0 ovn_metadata_agent[104667]: + exec neutron-ovn-metadata-agent
Jan 26 19:19:21 compute-0 systemd[1]: Reloading.
Jan 26 19:19:22 compute-0 systemd-rc-local-generator[104744]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:22 compute-0 systemd-sysv-generator[104749]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:22 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 26 19:19:22 compute-0 sudo[104609]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.944 104672 INFO neutron.common.config [-] Logging enabled!
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.945 104672 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.945 104672 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.946 104672 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.947 104672 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.948 104672 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.949 104672 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.950 104672 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.951 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.58 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.952 104672 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.953 104672 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.954 104672 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.955 104672 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.956 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.957 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.958 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.959 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.960 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.961 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.962 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.963 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.964 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.965 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.966 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.967 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.968 104672 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.969 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.970 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.971 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.972 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.973 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.974 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.975 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.976 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.977 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.978 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.979 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.980 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.981 104672 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.989 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.989 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.990 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.990 104672 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.990 104672 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 26 19:19:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:23.999 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 4b7fe4ab-0aa1-433c-a7da-fec1fee5732c (UUID: 4b7fe4ab-0aa1-433c-a7da-fec1fee5732c) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.021 104672 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.021 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.021 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.021 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.021 104672 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.024 104672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.027 104672 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.033 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '4b7fe4ab-0aa1-433c-a7da-fec1fee5732c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], external_ids={}, name=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, nb_cfg_timestamp=1769455104959, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.036 104672 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp63w58w22/privsep.sock']
Jan 26 19:19:24 compute-0 python3.9[104915]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 19:19:24 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.803 104672 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.804 104672 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp63w58w22/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.644 104941 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.648 104941 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.652 104941 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.652 104941 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104941
Jan 26 19:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:24.807 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[5456f82b-af6e-4188-b03d-41d5054f55f2]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.254 104941 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.254 104941 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.254 104941 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:19:25 compute-0 sudo[105071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpzjkocfotcbitbomwhguejgnnqnikqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455164.989538-953-21065011331323/AnsiballZ_stat.py'
Jan 26 19:19:25 compute-0 sudo[105071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:25 compute-0 python3.9[105073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:19:25 compute-0 sudo[105071]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.725 104941 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.731 104941 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.772 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8e7f98-367c-4b32-b281-5cc1e89da869]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.775 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, column=external_ids, values=({'neutron:ovn-metadata-id': 'a1e36cd5-fd8c-5653-8b5e-f6d402138398'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.787 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:19:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:19:25.793 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:19:26 compute-0 sudo[105196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmidmjxnqhbeiuqrnqebdwkvqunnqxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455164.989538-953-21065011331323/AnsiballZ_copy.py'
Jan 26 19:19:26 compute-0 sudo[105196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:26 compute-0 python3.9[105198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455164.989538-953-21065011331323/.source.yaml _original_basename=.2bkiv02g follow=False checksum=7f018bde372e350f7d68d05d26060ce5ea040076 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:26 compute-0 sudo[105196]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:26 compute-0 sshd-session[96424]: Connection closed by 192.168.122.30 port 33586
Jan 26 19:19:26 compute-0 sshd-session[96421]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:19:26 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 26 19:19:26 compute-0 systemd[1]: session-22.scope: Consumed 41.693s CPU time.
Jan 26 19:19:26 compute-0 systemd-logind[794]: Session 22 logged out. Waiting for processes to exit.
Jan 26 19:19:26 compute-0 systemd-logind[794]: Removed session 22.
Jan 26 19:19:32 compute-0 sshd-session[105224]: Accepted publickey for zuul from 192.168.122.30 port 57980 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:19:32 compute-0 systemd-logind[794]: New session 23 of user zuul.
Jan 26 19:19:32 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 26 19:19:32 compute-0 sshd-session[105224]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:19:33 compute-0 python3.9[105377]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:19:35 compute-0 sudo[105531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwbkfwhvvnvhxebkhvssifbucgdwvap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455174.649824-43-185320545128482/AnsiballZ_command.py'
Jan 26 19:19:35 compute-0 sudo[105531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:35 compute-0 python3.9[105533]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:19:35 compute-0 sudo[105531]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:36 compute-0 sudo[105696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-romsgfsggcdxpulaszblopnpozckcngz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455175.8780758-65-268137830676072/AnsiballZ_systemd_service.py'
Jan 26 19:19:36 compute-0 sudo[105696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:36 compute-0 python3.9[105698]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:19:36 compute-0 systemd[1]: Reloading.
Jan 26 19:19:36 compute-0 systemd-sysv-generator[105726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:19:36 compute-0 systemd-rc-local-generator[105721]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:19:37 compute-0 sudo[105696]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:38 compute-0 python3.9[105882]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:19:38 compute-0 network[105899]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:19:38 compute-0 network[105900]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:19:38 compute-0 network[105901]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:19:41 compute-0 sshd-session[105987]: Invalid user loginuser from 193.32.162.151 port 50140
Jan 26 19:19:41 compute-0 sshd-session[105987]: Connection closed by invalid user loginuser 193.32.162.151 port 50140 [preauth]
Jan 26 19:19:43 compute-0 sudo[106162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfoyfrkfrrdeusiuehvivuuuwyfurhwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455182.992044-103-251508772152688/AnsiballZ_systemd_service.py'
Jan 26 19:19:43 compute-0 sudo[106162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:43 compute-0 python3.9[106164]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:43 compute-0 sudo[106162]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:44 compute-0 sudo[106315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ersnxhpmncycmzayihkoclbegvujrnce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455183.887934-103-196194372978193/AnsiballZ_systemd_service.py'
Jan 26 19:19:44 compute-0 sudo[106315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:44 compute-0 python3.9[106317]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:44 compute-0 sudo[106315]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:45 compute-0 sudo[106468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqybfvgohgjplpuocrjpnwedahnenkgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455184.8398063-103-245950895578588/AnsiballZ_systemd_service.py'
Jan 26 19:19:45 compute-0 sudo[106468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:45 compute-0 python3.9[106470]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:45 compute-0 sudo[106468]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:46 compute-0 sudo[106621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbtdtqdbbwpnnabmxtxxkyupsbfzubuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455185.7341468-103-96242487706219/AnsiballZ_systemd_service.py'
Jan 26 19:19:46 compute-0 sudo[106621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:46 compute-0 python3.9[106623]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:46 compute-0 sudo[106621]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:46 compute-0 podman[106625]: 2026-01-26 19:19:46.682583966 +0000 UTC m=+0.132655474 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:19:47 compute-0 sudo[106800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqrjjiruilkdapogmomoikkheutikmob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455186.767947-103-127945855934247/AnsiballZ_systemd_service.py'
Jan 26 19:19:47 compute-0 sudo[106800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:47 compute-0 python3.9[106802]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:47 compute-0 sudo[106800]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:48 compute-0 sudo[106953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezsyqezsdumkewzurniubbjarjbsjinb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455187.751534-103-197638031330922/AnsiballZ_systemd_service.py'
Jan 26 19:19:48 compute-0 sudo[106953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:48 compute-0 python3.9[106955]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:48 compute-0 sudo[106953]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:48 compute-0 sudo[107106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvrybtbkerqtgtbvecfccdsyvvkcinpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455188.6080372-103-249728440189409/AnsiballZ_systemd_service.py'
Jan 26 19:19:48 compute-0 sudo[107106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:49 compute-0 python3.9[107108]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:19:49 compute-0 sudo[107106]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:50 compute-0 sudo[107259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrfnuleeczdzgkaefanamuzgcuaualnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455189.7205408-207-214729792089376/AnsiballZ_file.py'
Jan 26 19:19:50 compute-0 sudo[107259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:50 compute-0 python3.9[107261]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:50 compute-0 sudo[107259]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:51 compute-0 sudo[107411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohszbltozgipxyjapuwnpalkwilztke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455190.6352494-207-169551456046091/AnsiballZ_file.py'
Jan 26 19:19:51 compute-0 sudo[107411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:51 compute-0 python3.9[107413]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:51 compute-0 sudo[107411]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:51 compute-0 sudo[107563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpbmnkejupxnvbjyjccecwghmeocnvdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455191.3730469-207-47233356458946/AnsiballZ_file.py'
Jan 26 19:19:51 compute-0 sudo[107563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:51 compute-0 python3.9[107565]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:51 compute-0 sudo[107563]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:52 compute-0 podman[107665]: 2026-01-26 19:19:52.313358182 +0000 UTC m=+0.065851885 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 19:19:52 compute-0 sudo[107735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrbgmhqloovfdczgulbiuglcgxffdcrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455192.0703702-207-209843725101889/AnsiballZ_file.py'
Jan 26 19:19:52 compute-0 sudo[107735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:52 compute-0 python3.9[107737]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:52 compute-0 sudo[107735]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:53 compute-0 sudo[107887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iumgvunutbsotaliowiumrupapiqyvba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455192.841513-207-46642217350058/AnsiballZ_file.py'
Jan 26 19:19:53 compute-0 sudo[107887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:53 compute-0 python3.9[107889]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:53 compute-0 sudo[107887]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:53 compute-0 sudo[108039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmzgotfzrxfzgqqymgtdumbenbbffbnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455193.6133447-207-193859892667502/AnsiballZ_file.py'
Jan 26 19:19:53 compute-0 sudo[108039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:54 compute-0 python3.9[108041]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:54 compute-0 sudo[108039]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:54 compute-0 sudo[108191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bndzwmgbhxxoptkdjiocjhzvntmgpylv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455194.2798193-207-110644832010038/AnsiballZ_file.py'
Jan 26 19:19:54 compute-0 sudo[108191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:54 compute-0 python3.9[108193]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:54 compute-0 sudo[108191]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:55 compute-0 sudo[108343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eleyrxajriwtjnmnhrrrfrkmdcfkyehq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455195.1591544-307-173490608603205/AnsiballZ_file.py'
Jan 26 19:19:55 compute-0 sudo[108343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:55 compute-0 python3.9[108345]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:55 compute-0 sudo[108343]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:56 compute-0 sudo[108495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwyullbnmdqllphbnrmxqoiqnrxvqner ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455195.9308302-307-36645240561650/AnsiballZ_file.py'
Jan 26 19:19:56 compute-0 sudo[108495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:56 compute-0 python3.9[108497]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:56 compute-0 sudo[108495]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:56 compute-0 sudo[108647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzsfcnlsyuacfhoelyhanshlhmrnuyjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455196.5645888-307-240565284618737/AnsiballZ_file.py'
Jan 26 19:19:56 compute-0 sudo[108647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:57 compute-0 python3.9[108649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:57 compute-0 sudo[108647]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:57 compute-0 sudo[108799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvxtbhckxkuyguqwpsqoivicqtclxtnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455197.1830359-307-275324518742440/AnsiballZ_file.py'
Jan 26 19:19:57 compute-0 sudo[108799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:57 compute-0 python3.9[108801]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:57 compute-0 sudo[108799]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:58 compute-0 sudo[108951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqbnetijtxkkvmzgozxybdjriebckbly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455197.8455787-307-89370748548307/AnsiballZ_file.py'
Jan 26 19:19:58 compute-0 sudo[108951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:58 compute-0 python3.9[108953]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:58 compute-0 sudo[108951]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:58 compute-0 sudo[109103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eapcucfdftqquxuennxqubfywmymuzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455198.5446024-307-250842577973732/AnsiballZ_file.py'
Jan 26 19:19:58 compute-0 sudo[109103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:58 compute-0 python3.9[109105]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:59 compute-0 sudo[109103]: pam_unix(sudo:session): session closed for user root
Jan 26 19:19:59 compute-0 sudo[109255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxhiyvlaxmtjbyakisadknawixpegqmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455199.1619613-307-51070399261128/AnsiballZ_file.py'
Jan 26 19:19:59 compute-0 sudo[109255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:19:59 compute-0 python3.9[109257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:19:59 compute-0 sudo[109255]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:00 compute-0 sudo[109407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctqueqnpxnahyfdtqbaxuqwfybqtjagw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455200.041509-409-160256991978656/AnsiballZ_command.py'
Jan 26 19:20:00 compute-0 sudo[109407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:00 compute-0 python3.9[109409]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:00 compute-0 sudo[109407]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:01 compute-0 python3.9[109561]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:20:02 compute-0 sudo[109711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqxowihztyuzgyqliotukyfqlncttsdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455201.8443067-445-236012133736555/AnsiballZ_systemd_service.py'
Jan 26 19:20:02 compute-0 sudo[109711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:02 compute-0 python3.9[109713]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:20:02 compute-0 systemd[1]: Reloading.
Jan 26 19:20:02 compute-0 systemd-rc-local-generator[109737]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:20:02 compute-0 systemd-sysv-generator[109743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:20:02 compute-0 sudo[109711]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:03 compute-0 sudo[109898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eehiordcmabezfmlwrofqzxmapjvprin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455203.0988958-461-217874295020485/AnsiballZ_command.py'
Jan 26 19:20:03 compute-0 sudo[109898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:03 compute-0 python3.9[109900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:03 compute-0 sudo[109898]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:04 compute-0 sudo[110051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrtwqerfbhmbuukihxoknezubarjwnwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455203.871192-461-116207728713382/AnsiballZ_command.py'
Jan 26 19:20:04 compute-0 sudo[110051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:04 compute-0 python3.9[110053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:04 compute-0 sudo[110051]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:05 compute-0 sudo[110204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujvvqiuowuczgnrpseeturwycpxcwfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455204.6609755-461-60269114886075/AnsiballZ_command.py'
Jan 26 19:20:05 compute-0 sudo[110204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:05 compute-0 python3.9[110206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:05 compute-0 sudo[110204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:05 compute-0 sudo[110357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhvikcdmtovbnstpigakfdgtyejbdiym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455205.492938-461-199790151262445/AnsiballZ_command.py'
Jan 26 19:20:05 compute-0 sudo[110357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:06 compute-0 python3.9[110359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:06 compute-0 sudo[110357]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:06 compute-0 sudo[110510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnyugojxzswlddqiymeitvxrbvfdtdyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455206.2361815-461-102089998170781/AnsiballZ_command.py'
Jan 26 19:20:06 compute-0 sudo[110510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:06 compute-0 python3.9[110512]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:06 compute-0 sudo[110510]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:07 compute-0 sudo[110663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubuivjgvhkhkwqwcfmkyfmukqdctfgeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455207.0346074-461-23622652413121/AnsiballZ_command.py'
Jan 26 19:20:07 compute-0 sudo[110663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:07 compute-0 python3.9[110665]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:07 compute-0 sudo[110663]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:08 compute-0 sudo[110816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbzcmyoaqalpwlwqkwgvgvbehquvhqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455207.8357942-461-26999572207497/AnsiballZ_command.py'
Jan 26 19:20:08 compute-0 sudo[110816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:08 compute-0 python3.9[110818]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:20:08 compute-0 sudo[110816]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:09 compute-0 sudo[110969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpbujmvobgmoxxybtirueetviirfbiuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455209.017269-569-198802684983694/AnsiballZ_getent.py'
Jan 26 19:20:09 compute-0 sudo[110969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:09 compute-0 python3.9[110971]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 26 19:20:09 compute-0 sudo[110969]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:10 compute-0 sudo[111122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkairckfgwrjuzjhttmgwvivqznhtsqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455209.985995-585-43803447639830/AnsiballZ_group.py'
Jan 26 19:20:10 compute-0 sudo[111122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:10 compute-0 python3.9[111124]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:20:10 compute-0 groupadd[111125]: group added to /etc/group: name=libvirt, GID=42473
Jan 26 19:20:10 compute-0 groupadd[111125]: group added to /etc/gshadow: name=libvirt
Jan 26 19:20:10 compute-0 groupadd[111125]: new group: name=libvirt, GID=42473
Jan 26 19:20:10 compute-0 sudo[111122]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:11 compute-0 sudo[111280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krhxreezanwgepaqsnliomzqhrfvlyub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455211.012735-601-171144630992743/AnsiballZ_user.py'
Jan 26 19:20:11 compute-0 sudo[111280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:11 compute-0 python3.9[111282]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 19:20:11 compute-0 useradd[111284]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 19:20:11 compute-0 sudo[111280]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:12 compute-0 sudo[111440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-movvhpnrwbmljiqwthrifoqpvdyixvpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455212.2792368-623-28078254604097/AnsiballZ_setup.py'
Jan 26 19:20:12 compute-0 sudo[111440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:12 compute-0 python3.9[111442]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:20:13 compute-0 sudo[111440]: pam_unix(sudo:session): session closed for user root
Jan 26 19:20:13 compute-0 sudo[111524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljhjkrxliwdqccwcvnctnocwkwkjfpva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455212.2792368-623-28078254604097/AnsiballZ_dnf.py'
Jan 26 19:20:13 compute-0 sudo[111524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:20:13 compute-0 python3.9[111526]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:20:17 compute-0 podman[111534]: 2026-01-26 19:20:17.432528444 +0000 UTC m=+0.167907571 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Jan 26 19:20:23 compute-0 podman[111629]: 2026-01-26 19:20:23.328295114 +0000 UTC m=+0.067730664 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 19:20:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:20:23.983 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:20:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:20:23.984 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:20:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:20:23.984 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:20:41 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:20:41 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:20:48 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 26 19:20:48 compute-0 podman[111769]: 2026-01-26 19:20:48.401375302 +0000 UTC m=+0.136414793 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:20:50 compute-0 kernel: SELinux:  Converting 2764 SID table entries...
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:20:50 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:20:54 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 26 19:20:54 compute-0 podman[111803]: 2026-01-26 19:20:54.374786164 +0000 UTC m=+0.099982412 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:21:19 compute-0 podman[120305]: 2026-01-26 19:21:19.411058494 +0000 UTC m=+0.150788946 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 19:21:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:21:23.985 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:21:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:21:23.985 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:21:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:21:23.985 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:21:25 compute-0 podman[123015]: 2026-01-26 19:21:25.345367961 +0000 UTC m=+0.078369289 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Jan 26 19:21:49 compute-0 sshd-session[128736]: Invalid user loginuser from 193.32.162.151 port 55740
Jan 26 19:21:49 compute-0 sshd-session[128736]: Connection closed by invalid user loginuser 193.32.162.151 port 55740 [preauth]
Jan 26 19:21:50 compute-0 podman[128742]: 2026-01-26 19:21:50.38577233 +0000 UTC m=+0.131435680 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 19:21:51 compute-0 kernel: SELinux:  Converting 2765 SID table entries...
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 26 19:21:51 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 26 19:21:52 compute-0 groupadd[128776]: group added to /etc/group: name=dnsmasq, GID=993
Jan 26 19:21:52 compute-0 groupadd[128776]: group added to /etc/gshadow: name=dnsmasq
Jan 26 19:21:52 compute-0 groupadd[128776]: new group: name=dnsmasq, GID=993
Jan 26 19:21:52 compute-0 useradd[128783]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 26 19:21:52 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:21:52 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 26 19:21:52 compute-0 dbus-broker-launch[754]: Noticed file-system modification, trigger reload.
Jan 26 19:21:53 compute-0 groupadd[128796]: group added to /etc/group: name=clevis, GID=992
Jan 26 19:21:53 compute-0 groupadd[128796]: group added to /etc/gshadow: name=clevis
Jan 26 19:21:53 compute-0 groupadd[128796]: new group: name=clevis, GID=992
Jan 26 19:21:53 compute-0 useradd[128803]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 26 19:21:53 compute-0 usermod[128813]: add 'clevis' to group 'tss'
Jan 26 19:21:53 compute-0 usermod[128813]: add 'clevis' to shadow group 'tss'
Jan 26 19:21:55 compute-0 polkitd[43651]: Reloading rules
Jan 26 19:21:55 compute-0 polkitd[43651]: Collecting garbage unconditionally...
Jan 26 19:21:55 compute-0 polkitd[43651]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 19:21:55 compute-0 polkitd[43651]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 19:21:55 compute-0 polkitd[43651]: Finished loading, compiling and executing 3 rules
Jan 26 19:21:55 compute-0 polkitd[43651]: Reloading rules
Jan 26 19:21:55 compute-0 polkitd[43651]: Collecting garbage unconditionally...
Jan 26 19:21:55 compute-0 polkitd[43651]: Loading rules from directory /etc/polkit-1/rules.d
Jan 26 19:21:55 compute-0 polkitd[43651]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 26 19:21:55 compute-0 polkitd[43651]: Finished loading, compiling and executing 3 rules
Jan 26 19:21:55 compute-0 podman[128841]: 2026-01-26 19:21:55.963539159 +0000 UTC m=+0.080072285 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 19:21:57 compute-0 groupadd[129022]: group added to /etc/group: name=ceph, GID=167
Jan 26 19:21:57 compute-0 groupadd[129022]: group added to /etc/gshadow: name=ceph
Jan 26 19:21:57 compute-0 groupadd[129022]: new group: name=ceph, GID=167
Jan 26 19:21:57 compute-0 useradd[129028]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 26 19:22:00 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 26 19:22:00 compute-0 sshd[1002]: Received signal 15; terminating.
Jan 26 19:22:00 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 26 19:22:00 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 26 19:22:00 compute-0 systemd[1]: sshd.service: Consumed 1.677s CPU time, read 32.0K from disk, written 8.0K to disk.
Jan 26 19:22:00 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 26 19:22:00 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 26 19:22:00 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 19:22:00 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 19:22:00 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 26 19:22:00 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 26 19:22:00 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 26 19:22:00 compute-0 sshd[129547]: Server listening on 0.0.0.0 port 22.
Jan 26 19:22:00 compute-0 sshd[129547]: Server listening on :: port 22.
Jan 26 19:22:00 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 26 19:22:03 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:22:03 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:22:03 compute-0 systemd[1]: Reloading.
Jan 26 19:22:03 compute-0 systemd-sysv-generator[129812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:03 compute-0 systemd-rc-local-generator[129807]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:22:06 compute-0 sudo[111524]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:22:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:22:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.152s CPU time.
Jan 26 19:22:13 compute-0 systemd[1]: run-r2e18bdc02c2545c5bc385126f83bd26e.service: Deactivated successfully.
Jan 26 19:22:21 compute-0 podman[138208]: 2026-01-26 19:22:21.413335936 +0000 UTC m=+0.152042463 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.4, org.label-schema.build-date=20260120)
Jan 26 19:22:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:22:23.987 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:22:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:22:23.988 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:22:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:22:23.988 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:22:26 compute-0 podman[138235]: 2026-01-26 19:22:26.315061405 +0000 UTC m=+0.061939735 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 19:22:33 compute-0 sudo[138380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-recycqjufhwnewdqevvfcgmfdabsbamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455352.392715-647-102948168067293/AnsiballZ_systemd.py'
Jan 26 19:22:33 compute-0 sudo[138380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:33 compute-0 python3.9[138382]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:22:33 compute-0 systemd[1]: Reloading.
Jan 26 19:22:33 compute-0 systemd-sysv-generator[138415]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:33 compute-0 systemd-rc-local-generator[138410]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:33 compute-0 sudo[138380]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:34 compute-0 sudo[138569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfrcqrlquddxhszswxjqbeqyjrdccrue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455354.3470635-647-65747797002206/AnsiballZ_systemd.py'
Jan 26 19:22:34 compute-0 sudo[138569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:35 compute-0 python3.9[138571]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:22:35 compute-0 systemd[1]: Reloading.
Jan 26 19:22:35 compute-0 systemd-rc-local-generator[138601]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:35 compute-0 systemd-sysv-generator[138604]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:35 compute-0 sudo[138569]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:35 compute-0 sudo[138759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krpvarfmfqapefkandyifsypfvroixes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455355.606446-647-89183058841709/AnsiballZ_systemd.py'
Jan 26 19:22:35 compute-0 sudo[138759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:36 compute-0 python3.9[138761]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:22:36 compute-0 systemd[1]: Reloading.
Jan 26 19:22:36 compute-0 systemd-sysv-generator[138795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:36 compute-0 systemd-rc-local-generator[138792]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:36 compute-0 sudo[138759]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:37 compute-0 sudo[138950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-debehaynrjyjjojbburcktwjqmbuvqsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455356.9317982-647-239257595832156/AnsiballZ_systemd.py'
Jan 26 19:22:37 compute-0 sudo[138950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:37 compute-0 python3.9[138952]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:22:37 compute-0 systemd[1]: Reloading.
Jan 26 19:22:37 compute-0 systemd-rc-local-generator[138982]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:37 compute-0 systemd-sysv-generator[138985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:38 compute-0 sudo[138950]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:38 compute-0 sudo[139141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdfwqniymichyuplqjenjfpgbdcdasii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455358.4671779-705-61827433898832/AnsiballZ_systemd.py'
Jan 26 19:22:38 compute-0 sudo[139141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:39 compute-0 python3.9[139143]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:39 compute-0 systemd[1]: Reloading.
Jan 26 19:22:39 compute-0 systemd-rc-local-generator[139171]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:39 compute-0 systemd-sysv-generator[139176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:39 compute-0 sudo[139141]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:40 compute-0 sudo[139331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyohynozrblmgprbiaszyyxnctykinsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455359.852594-705-198205141897180/AnsiballZ_systemd.py'
Jan 26 19:22:40 compute-0 sudo[139331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:40 compute-0 python3.9[139333]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:40 compute-0 systemd[1]: Reloading.
Jan 26 19:22:40 compute-0 systemd-sysv-generator[139366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:40 compute-0 systemd-rc-local-generator[139361]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:40 compute-0 sudo[139331]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:41 compute-0 sudo[139521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsimdvnrbgcztkrgiqqmopkykueosgvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455361.1272964-705-2466397077489/AnsiballZ_systemd.py'
Jan 26 19:22:41 compute-0 sudo[139521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:41 compute-0 python3.9[139523]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:41 compute-0 systemd[1]: Reloading.
Jan 26 19:22:41 compute-0 systemd-sysv-generator[139557]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:41 compute-0 systemd-rc-local-generator[139553]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:42 compute-0 sudo[139521]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:42 compute-0 sudo[139711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnnqziroxujkknnlcmdimqgemzazedv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455362.327643-705-246776799146636/AnsiballZ_systemd.py'
Jan 26 19:22:42 compute-0 sudo[139711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:42 compute-0 python3.9[139713]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:43 compute-0 sudo[139711]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:43 compute-0 sudo[139866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjronzjmnwvmcsagzxczuxkxeruirdux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455363.220733-705-256009063111402/AnsiballZ_systemd.py'
Jan 26 19:22:43 compute-0 sudo[139866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:43 compute-0 python3.9[139868]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:43 compute-0 systemd[1]: Reloading.
Jan 26 19:22:44 compute-0 systemd-rc-local-generator[139899]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:44 compute-0 systemd-sysv-generator[139902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:44 compute-0 sudo[139866]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:44 compute-0 sudo[140055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgrjybbzklbnubtgjrneymoxjiyukhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455364.4545543-777-112616854238093/AnsiballZ_systemd.py'
Jan 26 19:22:44 compute-0 sudo[140055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:45 compute-0 python3.9[140057]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 26 19:22:45 compute-0 systemd[1]: Reloading.
Jan 26 19:22:45 compute-0 systemd-rc-local-generator[140085]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:22:45 compute-0 systemd-sysv-generator[140088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:22:45 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 26 19:22:45 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 26 19:22:45 compute-0 sudo[140055]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:46 compute-0 sudo[140248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblwskdgpybhddpcjtjfjvoaavtmnqiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455365.789951-793-225020534674207/AnsiballZ_systemd.py'
Jan 26 19:22:46 compute-0 sudo[140248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:46 compute-0 python3.9[140250]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:46 compute-0 sudo[140248]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:47 compute-0 sudo[140403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnrcffrtaweugatmfisxrkfhtjfmdzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455366.7987752-793-187132290264089/AnsiballZ_systemd.py'
Jan 26 19:22:47 compute-0 sudo[140403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:47 compute-0 python3.9[140405]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:47 compute-0 sudo[140403]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:48 compute-0 sudo[140558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytdzgrvjxdtgozavqllkhbxqruhkldkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455367.6969213-793-142375773648518/AnsiballZ_systemd.py'
Jan 26 19:22:48 compute-0 sudo[140558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:48 compute-0 python3.9[140560]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:49 compute-0 sudo[140558]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:49 compute-0 sudo[140713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhunudfyoovmrimnxoaqzpkszpskzesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455369.561429-793-25365955762078/AnsiballZ_systemd.py'
Jan 26 19:22:49 compute-0 sudo[140713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:50 compute-0 python3.9[140715]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:50 compute-0 sudo[140713]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:50 compute-0 sudo[140868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddmywfsckrvnjxgxyggtjgnrkerdllgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455370.4746702-793-190162565792073/AnsiballZ_systemd.py'
Jan 26 19:22:50 compute-0 sudo[140868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:51 compute-0 python3.9[140870]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:51 compute-0 sudo[140868]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:51 compute-0 sudo[141038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgftuqgqmiewlmdbyrivsryldjccvhty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455371.3462107-793-268152988195170/AnsiballZ_systemd.py'
Jan 26 19:22:51 compute-0 sudo[141038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:51 compute-0 podman[140997]: 2026-01-26 19:22:51.734490059 +0000 UTC m=+0.133814990 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:22:51 compute-0 python3.9[141045]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:52 compute-0 sudo[141038]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:52 compute-0 sudo[141204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yytvvbtaaeqcnzjadvlhhzugoiyhpymx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455372.24358-793-272593960745746/AnsiballZ_systemd.py'
Jan 26 19:22:52 compute-0 sudo[141204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:52 compute-0 python3.9[141206]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:53 compute-0 sudo[141204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:53 compute-0 sudo[141359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwrdgsugtaaexvpsspfhilgaqvakwucw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455373.214944-793-269393387957063/AnsiballZ_systemd.py'
Jan 26 19:22:53 compute-0 sudo[141359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:53 compute-0 python3.9[141361]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:53 compute-0 sudo[141359]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:54 compute-0 sudo[141514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajrxsizslnwkartsaavzyceofydxijn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455374.1375983-793-6047722158047/AnsiballZ_systemd.py'
Jan 26 19:22:54 compute-0 sudo[141514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:54 compute-0 python3.9[141516]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:54 compute-0 sudo[141514]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:55 compute-0 sudo[141669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnytmabajfgqvzobxaruktelzldtvoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455375.1009333-793-46341260429210/AnsiballZ_systemd.py'
Jan 26 19:22:55 compute-0 sudo[141669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:55 compute-0 python3.9[141671]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:55 compute-0 sudo[141669]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:56 compute-0 sudo[141824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tomhvbzfigdaduhxmzyzpiivxvntttun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455376.009225-793-33119449141664/AnsiballZ_systemd.py'
Jan 26 19:22:56 compute-0 sudo[141824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:56 compute-0 podman[141826]: 2026-01-26 19:22:56.424949718 +0000 UTC m=+0.059021914 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 19:22:56 compute-0 python3.9[141827]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:56 compute-0 sudo[141824]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:57 compute-0 sudo[141998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-artoyxsgqrfnpogmjyxadivaertqxbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455376.8951354-793-139906944557582/AnsiballZ_systemd.py'
Jan 26 19:22:57 compute-0 sudo[141998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:57 compute-0 python3.9[142000]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:58 compute-0 sudo[141998]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:59 compute-0 sudo[142153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omelpmjhltqsqdetjhamshkugoywijhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455378.8194077-793-82932095741451/AnsiballZ_systemd.py'
Jan 26 19:22:59 compute-0 sudo[142153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:22:59 compute-0 python3.9[142155]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:22:59 compute-0 sudo[142153]: pam_unix(sudo:session): session closed for user root
Jan 26 19:22:59 compute-0 sudo[142308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohokqrdqsvbsrwfacrlzrlzxwnczydsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455379.6561313-793-271078345759763/AnsiballZ_systemd.py'
Jan 26 19:22:59 compute-0 sudo[142308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:00 compute-0 python3.9[142310]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 26 19:23:00 compute-0 sudo[142308]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:01 compute-0 sudo[142463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfvxprdrizexkebgqwvupbalpcmgltuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455380.7577536-997-10014664385245/AnsiballZ_file.py'
Jan 26 19:23:01 compute-0 sudo[142463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:01 compute-0 python3.9[142465]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:01 compute-0 sudo[142463]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:01 compute-0 sudo[142615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhzzhoazygzyjtlbnqryriwhslhnixrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455381.4830852-997-143256336969685/AnsiballZ_file.py'
Jan 26 19:23:01 compute-0 sudo[142615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:02 compute-0 python3.9[142617]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:02 compute-0 sudo[142615]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:02 compute-0 sudo[142767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdivxsilolufyllddrfbvjvanjvzybmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455382.464273-997-109255976743786/AnsiballZ_file.py'
Jan 26 19:23:02 compute-0 sudo[142767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:02 compute-0 python3.9[142769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:02 compute-0 sudo[142767]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:03 compute-0 sudo[142919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmdorxffqfgntrwwhuhyjcuiziqofwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455383.1553578-997-21960284400675/AnsiballZ_file.py'
Jan 26 19:23:03 compute-0 sudo[142919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:03 compute-0 python3.9[142921]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:03 compute-0 sudo[142919]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:04 compute-0 sudo[143071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmsqfbhxfvguonkiqzgweycdjsygodug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455383.9267797-997-270559430544106/AnsiballZ_file.py'
Jan 26 19:23:04 compute-0 sudo[143071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:04 compute-0 python3.9[143073]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:04 compute-0 sudo[143071]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:04 compute-0 sudo[143223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynpjlhepelfprpeedovkvbkxnkxmroro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455384.6384501-997-272987782609726/AnsiballZ_file.py'
Jan 26 19:23:04 compute-0 sudo[143223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:05 compute-0 python3.9[143225]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:23:05 compute-0 sudo[143223]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:05 compute-0 python3.9[143375]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:23:06 compute-0 sudo[143525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tivmbucegtlyhtsvrwogkfkjechgczmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455386.187474-1099-103656706699183/AnsiballZ_stat.py'
Jan 26 19:23:06 compute-0 sudo[143525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:06 compute-0 python3.9[143527]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:06 compute-0 sudo[143525]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:07 compute-0 sudo[143650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddipzeaatrbxptaakutubmidndijpxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455386.187474-1099-103656706699183/AnsiballZ_copy.py'
Jan 26 19:23:07 compute-0 sudo[143650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:07 compute-0 python3.9[143652]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455386.187474-1099-103656706699183/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:07 compute-0 sudo[143650]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:08 compute-0 sudo[143802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frbvdsojznfkzqpinutwhdqazcaubpej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455387.9218037-1099-81283483500111/AnsiballZ_stat.py'
Jan 26 19:23:08 compute-0 sudo[143802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:08 compute-0 python3.9[143804]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:08 compute-0 sudo[143802]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:08 compute-0 sudo[143927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhymsidiygjsdshzwmglfpnuvdattmgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455387.9218037-1099-81283483500111/AnsiballZ_copy.py'
Jan 26 19:23:08 compute-0 sudo[143927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:09 compute-0 python3.9[143929]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455387.9218037-1099-81283483500111/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:09 compute-0 sudo[143927]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:09 compute-0 sudo[144079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymdgrolkjoifzkjqbrmmaypjzlebmuyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455389.371908-1099-121904717140400/AnsiballZ_stat.py'
Jan 26 19:23:09 compute-0 sudo[144079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:09 compute-0 python3.9[144081]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:10 compute-0 sudo[144079]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:10 compute-0 sudo[144204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofpnixwpzqxvofbggggapnryeeiaqfer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455389.371908-1099-121904717140400/AnsiballZ_copy.py'
Jan 26 19:23:10 compute-0 sudo[144204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:10 compute-0 python3.9[144206]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455389.371908-1099-121904717140400/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:10 compute-0 sudo[144204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:11 compute-0 sudo[144356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpcyayzcuqcdzpulvstvskcamysqsirk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455390.8762033-1099-163262461925043/AnsiballZ_stat.py'
Jan 26 19:23:11 compute-0 sudo[144356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:11 compute-0 python3.9[144358]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:11 compute-0 sudo[144356]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:12 compute-0 sudo[144481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmcydgfxnravhtsbgivphbxghagdkjuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455390.8762033-1099-163262461925043/AnsiballZ_copy.py'
Jan 26 19:23:12 compute-0 sudo[144481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:12 compute-0 python3.9[144483]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455390.8762033-1099-163262461925043/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:12 compute-0 sudo[144481]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:12 compute-0 sudo[144633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhiruegrqbajtnbrlaicbtsusjiqajhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455392.4509397-1099-62649241904249/AnsiballZ_stat.py'
Jan 26 19:23:12 compute-0 sudo[144633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:13 compute-0 python3.9[144635]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:13 compute-0 sudo[144633]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:13 compute-0 sudo[144758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjtlhajtzbyaxympwnuyldzfwpijpbim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455392.4509397-1099-62649241904249/AnsiballZ_copy.py'
Jan 26 19:23:13 compute-0 sudo[144758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:13 compute-0 python3.9[144760]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455392.4509397-1099-62649241904249/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:13 compute-0 sudo[144758]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:14 compute-0 sudo[144910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-konaokukipxiezhzlfuivsnmbxtzmxuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455393.9644887-1099-162434379362234/AnsiballZ_stat.py'
Jan 26 19:23:14 compute-0 sudo[144910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:14 compute-0 python3.9[144912]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:14 compute-0 sudo[144910]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:15 compute-0 sudo[145035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqsrvndagcdudnvxmzqjameuoikhjoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455393.9644887-1099-162434379362234/AnsiballZ_copy.py'
Jan 26 19:23:15 compute-0 sudo[145035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:15 compute-0 python3.9[145037]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455393.9644887-1099-162434379362234/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:15 compute-0 sudo[145035]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:15 compute-0 sudo[145187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdpkauhszyoqccogmdnsvswctsnfaqkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455395.449896-1099-141345159369243/AnsiballZ_stat.py'
Jan 26 19:23:15 compute-0 sudo[145187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:16 compute-0 python3.9[145189]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:16 compute-0 sudo[145187]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:16 compute-0 sudo[145310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pekislkraxpjviqmdjokkscsyzrfdobp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455395.449896-1099-141345159369243/AnsiballZ_copy.py'
Jan 26 19:23:16 compute-0 sudo[145310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:16 compute-0 python3.9[145312]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455395.449896-1099-141345159369243/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:16 compute-0 sudo[145310]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:17 compute-0 sudo[145462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkhomlkuuxrvartdkfmdpjrpdhberntl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455396.9772537-1099-184935695680818/AnsiballZ_stat.py'
Jan 26 19:23:17 compute-0 sudo[145462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:17 compute-0 python3.9[145464]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:17 compute-0 sudo[145462]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:18 compute-0 sudo[145587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltjdrxkpkugykbjietmmaeljdiotrsqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455396.9772537-1099-184935695680818/AnsiballZ_copy.py'
Jan 26 19:23:18 compute-0 sudo[145587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:18 compute-0 python3.9[145589]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769455396.9772537-1099-184935695680818/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:18 compute-0 sudo[145587]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:18 compute-0 sudo[145739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzlvrextpxnshevmwpfrjbkpzemxtnhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455398.5106852-1325-276940582867378/AnsiballZ_command.py'
Jan 26 19:23:18 compute-0 sudo[145739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:19 compute-0 python3.9[145741]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 26 19:23:19 compute-0 sudo[145739]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:19 compute-0 sudo[145892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpoxeswmoaieimnrcsihcdrtnjsppxmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455399.4242663-1343-255750392336733/AnsiballZ_file.py'
Jan 26 19:23:19 compute-0 sudo[145892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:20 compute-0 python3.9[145894]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:20 compute-0 sudo[145892]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:20 compute-0 sudo[146044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyrvurvxakppvcjflzwylqbqcgujndgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455400.2496278-1343-202974648465473/AnsiballZ_file.py'
Jan 26 19:23:20 compute-0 sudo[146044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:20 compute-0 python3.9[146046]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:20 compute-0 sudo[146044]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:21 compute-0 sudo[146196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyeitbnkadcefljbgawsfbwlucjqljxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455401.0272222-1343-65331718511071/AnsiballZ_file.py'
Jan 26 19:23:21 compute-0 sudo[146196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:21 compute-0 python3.9[146198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:21 compute-0 sudo[146196]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:22 compute-0 sudo[146358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyevturocpyclxndgzofrprkqzuabvju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455401.8722475-1343-272495920675589/AnsiballZ_file.py'
Jan 26 19:23:22 compute-0 sudo[146358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:22 compute-0 podman[146322]: 2026-01-26 19:23:22.372985111 +0000 UTC m=+0.145191640 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 19:23:22 compute-0 python3.9[146366]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:22 compute-0 sudo[146358]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:23 compute-0 sudo[146523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-golymrwvannltfjkresnnndykossjkya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455402.708324-1343-263279827739508/AnsiballZ_file.py'
Jan 26 19:23:23 compute-0 sudo[146523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:23 compute-0 python3.9[146525]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:23 compute-0 sudo[146523]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:23 compute-0 sudo[146675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mklmytqtbjvshwmziweaiblmlsvxmtcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455403.485646-1343-37899719613303/AnsiballZ_file.py'
Jan 26 19:23:23 compute-0 sudo[146675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:23:23.990 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:23:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:23:23.991 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:23:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:23:23.991 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:23:24 compute-0 python3.9[146677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:24 compute-0 sudo[146675]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:24 compute-0 sudo[146828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-martqbephiervfgvttbwkubrsmcxjkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455404.193637-1343-149834717905145/AnsiballZ_file.py'
Jan 26 19:23:24 compute-0 sudo[146828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:24 compute-0 python3.9[146830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:24 compute-0 sudo[146828]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:25 compute-0 sudo[146980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwwwmwtdpwmnttklwrsmnndyaxpfpypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455404.9498658-1343-239480058274445/AnsiballZ_file.py'
Jan 26 19:23:25 compute-0 sudo[146980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:25 compute-0 python3.9[146982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:25 compute-0 sudo[146980]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:26 compute-0 sudo[147132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftwqkdqwripqfndiuiywtxorwnoffujl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455405.728434-1343-272513283272453/AnsiballZ_file.py'
Jan 26 19:23:26 compute-0 sudo[147132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:26 compute-0 python3.9[147134]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:26 compute-0 sudo[147132]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:26 compute-0 sudo[147298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcqjdoallwykztacigbspcqwqrvthdth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455406.5815077-1343-269440172674919/AnsiballZ_file.py'
Jan 26 19:23:26 compute-0 sudo[147298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:26 compute-0 podman[147258]: 2026-01-26 19:23:26.993035666 +0000 UTC m=+0.075779945 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 19:23:27 compute-0 python3.9[147306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:27 compute-0 sudo[147298]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:27 compute-0 sudo[147457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qponopiccxwhdbgbtqbeqouxtkmacthl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455407.41567-1343-81465118102584/AnsiballZ_file.py'
Jan 26 19:23:27 compute-0 sudo[147457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:28 compute-0 python3.9[147459]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:28 compute-0 sudo[147457]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:28 compute-0 sudo[147609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acyigxetouftqessfhwiocjgzgwhnsnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455408.2188544-1343-95324658092075/AnsiballZ_file.py'
Jan 26 19:23:28 compute-0 sudo[147609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:28 compute-0 python3.9[147611]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:28 compute-0 sudo[147609]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:29 compute-0 sudo[147761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tffiitbgtnyxgxefhkuiexnkkgqompym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455409.0311964-1343-42215837462674/AnsiballZ_file.py'
Jan 26 19:23:29 compute-0 sudo[147761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:29 compute-0 python3.9[147763]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:29 compute-0 sudo[147761]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:30 compute-0 sudo[147913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqhxflxfmbdfjcqdsliokcrlutfnscoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455409.7871447-1343-224976218898618/AnsiballZ_file.py'
Jan 26 19:23:30 compute-0 sudo[147913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:30 compute-0 python3.9[147915]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:30 compute-0 sudo[147913]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:30 compute-0 sudo[148065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxlbwondwnirmzqubuemiaafcxrvhpqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455410.6550477-1541-254914018034035/AnsiballZ_stat.py'
Jan 26 19:23:31 compute-0 sudo[148065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:31 compute-0 python3.9[148067]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:31 compute-0 sudo[148065]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:31 compute-0 sudo[148188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fipbnvfocowlzagdfcytayopmifjgcvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455410.6550477-1541-254914018034035/AnsiballZ_copy.py'
Jan 26 19:23:31 compute-0 sudo[148188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:31 compute-0 python3.9[148190]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455410.6550477-1541-254914018034035/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:31 compute-0 sudo[148188]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:32 compute-0 sudo[148340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozwicnyykbipyfxsvcfvvxqxwdvvgxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455411.9899302-1541-33289972099002/AnsiballZ_stat.py'
Jan 26 19:23:32 compute-0 sudo[148340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:32 compute-0 python3.9[148342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:32 compute-0 sudo[148340]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:33 compute-0 sudo[148463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcrvyifuupxwlmqtldepdrywpwlrjfcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455411.9899302-1541-33289972099002/AnsiballZ_copy.py'
Jan 26 19:23:33 compute-0 sudo[148463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:33 compute-0 python3.9[148465]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455411.9899302-1541-33289972099002/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:33 compute-0 sudo[148463]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:33 compute-0 sudo[148615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwogfneelsjupqjqhqwarmglymmddvlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455413.5334852-1541-224797128601147/AnsiballZ_stat.py'
Jan 26 19:23:33 compute-0 sudo[148615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:34 compute-0 python3.9[148617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:34 compute-0 sudo[148615]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:34 compute-0 sudo[148738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeqsmnbscwfhxwqogbzsoznjqzonjtyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455413.5334852-1541-224797128601147/AnsiballZ_copy.py'
Jan 26 19:23:34 compute-0 sudo[148738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:34 compute-0 python3.9[148740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455413.5334852-1541-224797128601147/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:34 compute-0 sudo[148738]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:35 compute-0 sudo[148890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-infxmlbacpkycxwbhezchzdexsnrgqhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455415.0869877-1541-117675488852563/AnsiballZ_stat.py'
Jan 26 19:23:35 compute-0 sudo[148890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:35 compute-0 python3.9[148892]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:35 compute-0 sudo[148890]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:36 compute-0 sudo[149013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyntcxajqukzmcbmogrkfkyrsahevdzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455415.0869877-1541-117675488852563/AnsiballZ_copy.py'
Jan 26 19:23:36 compute-0 sudo[149013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:36 compute-0 python3.9[149015]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455415.0869877-1541-117675488852563/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:36 compute-0 sudo[149013]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:36 compute-0 sudo[149165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scbzebyynzjanooklzrnsrawtftnpodt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455416.462858-1541-273182244277009/AnsiballZ_stat.py'
Jan 26 19:23:36 compute-0 sudo[149165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:36 compute-0 python3.9[149167]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:37 compute-0 sudo[149165]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:37 compute-0 sudo[149288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaaqxbjbtbwlemciypfmocwskaombmrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455416.462858-1541-273182244277009/AnsiballZ_copy.py'
Jan 26 19:23:37 compute-0 sudo[149288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:37 compute-0 python3.9[149290]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455416.462858-1541-273182244277009/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:37 compute-0 sudo[149288]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:38 compute-0 sudo[149440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfrrsqwshwkgughzngmijegkmvwrlqbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455417.8365974-1541-16766891858206/AnsiballZ_stat.py'
Jan 26 19:23:38 compute-0 sudo[149440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:38 compute-0 python3.9[149442]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:38 compute-0 sudo[149440]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:39 compute-0 sudo[149563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfetqxmbvkbmvmdntafvliopmqomrhpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455417.8365974-1541-16766891858206/AnsiballZ_copy.py'
Jan 26 19:23:39 compute-0 sudo[149563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:39 compute-0 python3.9[149565]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455417.8365974-1541-16766891858206/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:39 compute-0 sudo[149563]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:39 compute-0 sudo[149715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujfdjekabtzcfhipebiumhxqxvtuvqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455419.4600387-1541-205430254592246/AnsiballZ_stat.py'
Jan 26 19:23:39 compute-0 sudo[149715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:40 compute-0 python3.9[149717]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:40 compute-0 sudo[149715]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:40 compute-0 sudo[149838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgpndnnsmacyngaoikyvehvepbdgsfbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455419.4600387-1541-205430254592246/AnsiballZ_copy.py'
Jan 26 19:23:40 compute-0 sudo[149838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:40 compute-0 python3.9[149840]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455419.4600387-1541-205430254592246/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:40 compute-0 sudo[149838]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:41 compute-0 sudo[149990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxrnjbxuasthztxyoukkpbjorflccawv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455420.9610968-1541-181972967103212/AnsiballZ_stat.py'
Jan 26 19:23:41 compute-0 sudo[149990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:41 compute-0 python3.9[149992]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:41 compute-0 sudo[149990]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:42 compute-0 sudo[150113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvizybuayenwwxjyoyzlxtdqbtgqexvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455420.9610968-1541-181972967103212/AnsiballZ_copy.py'
Jan 26 19:23:42 compute-0 sudo[150113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:42 compute-0 python3.9[150115]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455420.9610968-1541-181972967103212/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:42 compute-0 sudo[150113]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:42 compute-0 sudo[150265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfkkmjmdnwaamujucofluatvuawtwbni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455422.450643-1541-19774719934221/AnsiballZ_stat.py'
Jan 26 19:23:42 compute-0 sudo[150265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:43 compute-0 python3.9[150267]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:43 compute-0 sudo[150265]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:43 compute-0 sudo[150388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnutfzuxexbabhpmbhhjyddstnzogsbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455422.450643-1541-19774719934221/AnsiballZ_copy.py'
Jan 26 19:23:43 compute-0 sudo[150388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:43 compute-0 python3.9[150390]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455422.450643-1541-19774719934221/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:43 compute-0 sudo[150388]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:44 compute-0 sudo[150540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaazxrtnrvtkfnroskalntyvqeknzyya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455423.922288-1541-264110301002718/AnsiballZ_stat.py'
Jan 26 19:23:44 compute-0 sudo[150540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:44 compute-0 python3.9[150542]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:44 compute-0 sudo[150540]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:45 compute-0 sudo[150663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luzpujwwurbxcpoeafsttvqwcimmqfql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455423.922288-1541-264110301002718/AnsiballZ_copy.py'
Jan 26 19:23:45 compute-0 sudo[150663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:45 compute-0 python3.9[150665]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455423.922288-1541-264110301002718/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:45 compute-0 sudo[150663]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:45 compute-0 sudo[150815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kajaicysdtdsiwrltrvtxxhbmmzaadix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455425.4205225-1541-185007696448804/AnsiballZ_stat.py'
Jan 26 19:23:45 compute-0 sudo[150815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:46 compute-0 python3.9[150817]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:46 compute-0 sudo[150815]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:46 compute-0 sudo[150938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzzrwwayheqcjiykafvouvtvjczqieeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455425.4205225-1541-185007696448804/AnsiballZ_copy.py'
Jan 26 19:23:46 compute-0 sudo[150938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:46 compute-0 python3.9[150940]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455425.4205225-1541-185007696448804/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:46 compute-0 sudo[150938]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:47 compute-0 sudo[151090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsxaiaedvakrpugdulxszrjxflrydzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455426.948318-1541-103169887159863/AnsiballZ_stat.py'
Jan 26 19:23:47 compute-0 sudo[151090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:47 compute-0 python3.9[151092]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:47 compute-0 sudo[151090]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:47 compute-0 sudo[151213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnowqkxqnkynzggggprbvsodcybuomrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455426.948318-1541-103169887159863/AnsiballZ_copy.py'
Jan 26 19:23:47 compute-0 sudo[151213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:48 compute-0 python3.9[151215]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455426.948318-1541-103169887159863/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:48 compute-0 sudo[151213]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:48 compute-0 sudo[151365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfwdlimknblhosgxdtfyhzwjsvnmpuxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455428.3718762-1541-188611961611468/AnsiballZ_stat.py'
Jan 26 19:23:48 compute-0 sudo[151365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:48 compute-0 python3.9[151367]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:48 compute-0 sudo[151365]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:49 compute-0 sudo[151488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvvkihdjhucwvkfhklbcpwfklfodakqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455428.3718762-1541-188611961611468/AnsiballZ_copy.py'
Jan 26 19:23:49 compute-0 sudo[151488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:49 compute-0 python3.9[151490]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455428.3718762-1541-188611961611468/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:49 compute-0 sudo[151488]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:50 compute-0 sudo[151640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eojyvagarhdpwcyiwnevspmnonmroxlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455429.7973185-1541-200414896886288/AnsiballZ_stat.py'
Jan 26 19:23:50 compute-0 sudo[151640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:50 compute-0 python3.9[151642]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:23:50 compute-0 sudo[151640]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:50 compute-0 sudo[151763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-catuqddqglooauyjgykuibfpzfblhvga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455429.7973185-1541-200414896886288/AnsiballZ_copy.py'
Jan 26 19:23:50 compute-0 sudo[151763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:51 compute-0 python3.9[151765]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455429.7973185-1541-200414896886288/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:51 compute-0 sudo[151763]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:51 compute-0 python3.9[151915]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:23:52 compute-0 sudo[152086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jccijimxvixfqdazkofofvknlazsjfrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455432.28794-1953-85827313205083/AnsiballZ_seboolean.py'
Jan 26 19:23:52 compute-0 sudo[152086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:52 compute-0 podman[152042]: 2026-01-26 19:23:52.987121896 +0000 UTC m=+0.175385041 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:23:53 compute-0 python3.9[152092]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 26 19:23:54 compute-0 sudo[152086]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:54 compute-0 sudo[152250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otqtfzpimzipstnxeppmteolpdagvxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455434.6176481-1969-144092260060882/AnsiballZ_copy.py'
Jan 26 19:23:54 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 26 19:23:54 compute-0 sudo[152250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:55 compute-0 python3.9[152252]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:55 compute-0 sudo[152250]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:55 compute-0 sudo[152402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jigpkcywdtdsqegoqtmnsoksuyzdtjsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455435.3950639-1969-217919188287200/AnsiballZ_copy.py'
Jan 26 19:23:55 compute-0 sudo[152402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:55 compute-0 python3.9[152404]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:55 compute-0 sudo[152402]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:56 compute-0 sudo[152554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqanbfgxcducmjdlgzohzwygwrmbnys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455436.1448069-1969-9215674182388/AnsiballZ_copy.py'
Jan 26 19:23:56 compute-0 sudo[152554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:56 compute-0 python3.9[152556]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:56 compute-0 sudo[152554]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:57 compute-0 sudo[152718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txzdihipkteashnytjvjcbvqxjroapmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455436.893489-1969-59781488847063/AnsiballZ_copy.py'
Jan 26 19:23:57 compute-0 sudo[152718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:57 compute-0 podman[152680]: 2026-01-26 19:23:57.222820116 +0000 UTC m=+0.068677376 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 19:23:57 compute-0 python3.9[152724]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:57 compute-0 sudo[152718]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:57 compute-0 sudo[152879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxlrcqxfujbhrazcwapzpttkkrwguauw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455437.5869167-1969-23970731279302/AnsiballZ_copy.py'
Jan 26 19:23:57 compute-0 sudo[152879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:58 compute-0 sshd-session[152758]: Invalid user loginuser from 193.32.162.151 port 33098
Jan 26 19:23:58 compute-0 python3.9[152881]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:58 compute-0 sudo[152879]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:58 compute-0 sshd-session[152758]: Connection closed by invalid user loginuser 193.32.162.151 port 33098 [preauth]
Jan 26 19:23:58 compute-0 sudo[153031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjercjiosmyvkdxhzjhjbogofapaupn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455438.2941163-2041-131302038896018/AnsiballZ_copy.py'
Jan 26 19:23:58 compute-0 sudo[153031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:58 compute-0 python3.9[153033]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:58 compute-0 sudo[153031]: pam_unix(sudo:session): session closed for user root
Jan 26 19:23:59 compute-0 sudo[153183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafaqdusgoyclqoffrqldmgenixbtmge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455439.0696483-2041-279486200353801/AnsiballZ_copy.py'
Jan 26 19:23:59 compute-0 sudo[153183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:23:59 compute-0 python3.9[153185]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:23:59 compute-0 sudo[153183]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:00 compute-0 sudo[153335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfbfbclkrfpxjwlbjexbsbyuasxeturc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455439.8115246-2041-73627798609524/AnsiballZ_copy.py'
Jan 26 19:24:00 compute-0 sudo[153335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:00 compute-0 python3.9[153337]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:00 compute-0 sudo[153335]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:00 compute-0 sudo[153487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzvaoxvzrrjxuparmolwolkrcftekxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455440.502653-2041-190972611852814/AnsiballZ_copy.py'
Jan 26 19:24:00 compute-0 sudo[153487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:01 compute-0 python3.9[153489]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:01 compute-0 sudo[153487]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:01 compute-0 sudo[153639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvnxtouepqhsdgkoaygasturtlrupfsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455441.3904781-2041-216611427710000/AnsiballZ_copy.py'
Jan 26 19:24:01 compute-0 sudo[153639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:02 compute-0 python3.9[153641]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:02 compute-0 sudo[153639]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:02 compute-0 sudo[153791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdrhzvvypmrnhmyqluvwynxoemkkntcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455442.2744932-2113-56511334568107/AnsiballZ_systemd.py'
Jan 26 19:24:02 compute-0 sudo[153791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:03 compute-0 python3.9[153793]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:24:03 compute-0 systemd[1]: Reloading.
Jan 26 19:24:03 compute-0 systemd-rc-local-generator[153822]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:03 compute-0 systemd-sysv-generator[153825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:03 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 26 19:24:03 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 26 19:24:03 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 26 19:24:03 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 26 19:24:03 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 26 19:24:03 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 26 19:24:03 compute-0 sudo[153791]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:04 compute-0 sudo[153985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnffttsliqqdxbdigpypssrbnqwufiwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455443.7789977-2113-167268839414736/AnsiballZ_systemd.py'
Jan 26 19:24:04 compute-0 sudo[153985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:04 compute-0 python3.9[153987]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:24:04 compute-0 systemd[1]: Reloading.
Jan 26 19:24:04 compute-0 systemd-rc-local-generator[154015]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:04 compute-0 systemd-sysv-generator[154019]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:04 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 26 19:24:04 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 26 19:24:04 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 26 19:24:04 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 26 19:24:04 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 26 19:24:04 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 26 19:24:04 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 19:24:04 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 26 19:24:04 compute-0 sudo[153985]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:05 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 26 19:24:05 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 26 19:24:05 compute-0 sudo[154201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovycwjoakbfzkcufgxvkfizppgozuygj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455445.1251087-2113-97589277512519/AnsiballZ_systemd.py'
Jan 26 19:24:05 compute-0 sudo[154201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:05 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 26 19:24:05 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 26 19:24:05 compute-0 python3.9[154204]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:24:05 compute-0 systemd[1]: Reloading.
Jan 26 19:24:05 compute-0 systemd-sysv-generator[154242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:05 compute-0 systemd-rc-local-generator[154237]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:06 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 26 19:24:06 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 26 19:24:06 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 26 19:24:06 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 26 19:24:06 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 26 19:24:06 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 26 19:24:06 compute-0 sudo[154201]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:06 compute-0 setroubleshoot[154075]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 8444c963-9fff-450d-bfd8-e190ad13dd19
Jan 26 19:24:06 compute-0 setroubleshoot[154075]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 19:24:06 compute-0 setroubleshoot[154075]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 8444c963-9fff-450d-bfd8-e190ad13dd19
Jan 26 19:24:06 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:24:06 compute-0 setroubleshoot[154075]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 26 19:24:06 compute-0 sudo[154423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vykhbbeaudfjolfxwwdivllanfrxleqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455446.4227705-2113-10791412430532/AnsiballZ_systemd.py'
Jan 26 19:24:06 compute-0 sudo[154423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:07 compute-0 python3.9[154425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:24:07 compute-0 systemd[1]: Reloading.
Jan 26 19:24:07 compute-0 systemd-rc-local-generator[154454]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:07 compute-0 systemd-sysv-generator[154458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:07 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 26 19:24:07 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 26 19:24:07 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 26 19:24:07 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 26 19:24:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 26 19:24:07 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 26 19:24:07 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 26 19:24:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 26 19:24:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 26 19:24:07 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 26 19:24:07 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 19:24:07 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 26 19:24:07 compute-0 sudo[154423]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:08 compute-0 sudo[154638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujzrbxgbhbzzdmswxmjptulukwazycmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455447.769438-2113-173439806724961/AnsiballZ_systemd.py'
Jan 26 19:24:08 compute-0 sudo[154638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:08 compute-0 python3.9[154640]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:24:08 compute-0 systemd[1]: Reloading.
Jan 26 19:24:08 compute-0 systemd-rc-local-generator[154668]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:08 compute-0 systemd-sysv-generator[154671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:08 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 26 19:24:08 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 26 19:24:08 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 26 19:24:08 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 26 19:24:08 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 26 19:24:08 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 26 19:24:08 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 26 19:24:08 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 26 19:24:08 compute-0 sudo[154638]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:09 compute-0 sudo[154850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsnsuayuypdtyoonyotuyjtagfwlxisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455449.2744281-2187-236826658968456/AnsiballZ_file.py'
Jan 26 19:24:09 compute-0 sudo[154850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:09 compute-0 python3.9[154852]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:09 compute-0 sudo[154850]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:10 compute-0 sudo[155003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npadjybjaeetnwydqaamvbamfwiqzida ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455450.0726428-2203-12233290882714/AnsiballZ_find.py'
Jan 26 19:24:10 compute-0 sudo[155003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:10 compute-0 python3.9[155005]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:24:10 compute-0 sudo[155003]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:11 compute-0 sudo[155155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxwslyjrveqczbiusvratqdnetyqhsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455451.2782824-2231-80699381090927/AnsiballZ_stat.py'
Jan 26 19:24:11 compute-0 sudo[155155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:11 compute-0 python3.9[155157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:11 compute-0 sudo[155155]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:12 compute-0 sudo[155278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugwcugwflibrvahzafbligrrdxumhilc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455451.2782824-2231-80699381090927/AnsiballZ_copy.py'
Jan 26 19:24:12 compute-0 sudo[155278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:12 compute-0 python3.9[155280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455451.2782824-2231-80699381090927/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:12 compute-0 sudo[155278]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:13 compute-0 sudo[155430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssthslhpbmobdcnvjgumwrsiprbqeyri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455452.9769485-2263-225335654415167/AnsiballZ_file.py'
Jan 26 19:24:13 compute-0 sudo[155430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:13 compute-0 python3.9[155432]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:13 compute-0 sudo[155430]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:14 compute-0 sudo[155582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvzpbxbnuacaqwmrxzrkkbjlivzahyta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455453.874051-2279-201507345807003/AnsiballZ_stat.py'
Jan 26 19:24:14 compute-0 sudo[155582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:14 compute-0 python3.9[155584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:14 compute-0 sudo[155582]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:14 compute-0 sudo[155660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irfmbqacxmidbpgvomtmbxxrbwvhpesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455453.874051-2279-201507345807003/AnsiballZ_file.py'
Jan 26 19:24:14 compute-0 sudo[155660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:15 compute-0 python3.9[155662]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:15 compute-0 sudo[155660]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:15 compute-0 sudo[155812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nynifmtzdttapjmlnywgsytrycyeaanr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455455.3485503-2303-75435926457896/AnsiballZ_stat.py'
Jan 26 19:24:15 compute-0 sudo[155812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:15 compute-0 python3.9[155814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:15 compute-0 sudo[155812]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:16 compute-0 sudo[155890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzztrvlwpefpfcnqhceeowfpncfhbkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455455.3485503-2303-75435926457896/AnsiballZ_file.py'
Jan 26 19:24:16 compute-0 sudo[155890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:16 compute-0 python3.9[155892]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3jp9z3xu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:16 compute-0 sudo[155890]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:16 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 26 19:24:16 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 26 19:24:17 compute-0 sudo[156042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywxbhuqvjptfaurumzjdkyydpqvctvff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455456.6165957-2327-175026856690127/AnsiballZ_stat.py'
Jan 26 19:24:17 compute-0 sudo[156042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:17 compute-0 python3.9[156044]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:17 compute-0 sudo[156042]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:17 compute-0 sudo[156120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bflgmoiwjkecpsjrpzwfbhupoybwpzhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455456.6165957-2327-175026856690127/AnsiballZ_file.py'
Jan 26 19:24:17 compute-0 sudo[156120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:17 compute-0 python3.9[156122]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:17 compute-0 sudo[156120]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:18 compute-0 sudo[156272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvrhrkljqsdmhanulfdzgxrmaxjfydsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455458.0991163-2353-54729863413310/AnsiballZ_command.py'
Jan 26 19:24:18 compute-0 sudo[156272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:18 compute-0 python3.9[156274]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:24:18 compute-0 sudo[156272]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:19 compute-0 sudo[156425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydhrhwhximfiyegmoisexoydpspxqmaa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455458.9975605-2369-136226511538486/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 19:24:19 compute-0 sudo[156425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:19 compute-0 python3[156427]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 19:24:19 compute-0 sudo[156425]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:20 compute-0 sudo[156577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnqbcuyhqdkkvhzjqyengsvdmayikaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455459.9349344-2385-149728359092323/AnsiballZ_stat.py'
Jan 26 19:24:20 compute-0 sudo[156577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:20 compute-0 python3.9[156579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:20 compute-0 sudo[156577]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:20 compute-0 sudo[156655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxidoybpmgqrtobogcqxujonjcybcqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455459.9349344-2385-149728359092323/AnsiballZ_file.py'
Jan 26 19:24:20 compute-0 sudo[156655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:21 compute-0 python3.9[156657]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:21 compute-0 sudo[156655]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:21 compute-0 sudo[156807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsebmuxeryyrvcskjucgpczggxtjlgsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455461.3406153-2409-91851992008373/AnsiballZ_stat.py'
Jan 26 19:24:21 compute-0 sudo[156807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:22 compute-0 python3.9[156809]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:22 compute-0 sudo[156807]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:22 compute-0 sudo[156932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xunkopbfmlursifmvmnzquxqzyxkyvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455461.3406153-2409-91851992008373/AnsiballZ_copy.py'
Jan 26 19:24:22 compute-0 sudo[156932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:22 compute-0 python3.9[156934]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455461.3406153-2409-91851992008373/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:22 compute-0 sudo[156932]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:23 compute-0 podman[157034]: 2026-01-26 19:24:23.429485659 +0000 UTC m=+0.160303721 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:24:23 compute-0 sudo[157110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlnlsichayqncolswwckksndnclxbtyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455462.9765506-2439-59048927353640/AnsiballZ_stat.py'
Jan 26 19:24:23 compute-0 sudo[157110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:23 compute-0 python3.9[157112]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:23 compute-0 sudo[157110]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:24:23.993 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:24:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:24:23.995 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:24:23 compute-0 sudo[157188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyptfxrkmcltfmajychwxjvkwuxakgbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455462.9765506-2439-59048927353640/AnsiballZ_file.py'
Jan 26 19:24:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:24:23.995 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:24:23 compute-0 sudo[157188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:24 compute-0 python3.9[157191]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:24 compute-0 sudo[157188]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:24 compute-0 sudo[157341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khakrcsttnzyrbiwvvvaolfmdzquvbzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455464.5007894-2463-38432601370020/AnsiballZ_stat.py'
Jan 26 19:24:24 compute-0 sudo[157341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:25 compute-0 python3.9[157343]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:25 compute-0 sudo[157341]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:25 compute-0 sudo[157419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hriztgflfkyheowwpxlfylwtqsjpttlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455464.5007894-2463-38432601370020/AnsiballZ_file.py'
Jan 26 19:24:25 compute-0 sudo[157419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:25 compute-0 python3.9[157421]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:25 compute-0 sudo[157419]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:26 compute-0 sudo[157571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbpndgevuksoycysixfmcmwtfarihyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455466.0410829-2487-132424384612739/AnsiballZ_stat.py'
Jan 26 19:24:26 compute-0 sudo[157571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:26 compute-0 python3.9[157573]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:26 compute-0 sudo[157571]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:27 compute-0 sudo[157696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbbkiluhtelmnaqobniylhntfhoujayz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455466.0410829-2487-132424384612739/AnsiballZ_copy.py'
Jan 26 19:24:27 compute-0 sudo[157696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:27 compute-0 podman[157698]: 2026-01-26 19:24:27.362403379 +0000 UTC m=+0.081469218 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:24:27 compute-0 python3.9[157699]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455466.0410829-2487-132424384612739/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:27 compute-0 sudo[157696]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:28 compute-0 sudo[157869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npdldysvdmjzndqmvrnuijaujvniajel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455467.723456-2517-63721088717853/AnsiballZ_file.py'
Jan 26 19:24:28 compute-0 sudo[157869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:28 compute-0 python3.9[157871]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:28 compute-0 sudo[157869]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:28 compute-0 sudo[158021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovtkuapurpulnpfumesyeqejpvrdragh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455468.475927-2533-40700221919018/AnsiballZ_command.py'
Jan 26 19:24:28 compute-0 sudo[158021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:29 compute-0 python3.9[158023]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:24:29 compute-0 sudo[158021]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:29 compute-0 sudo[158176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nznqvjfrjjqdrcbgxspldtdqmjzewbbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455469.3401399-2549-138355850364998/AnsiballZ_blockinfile.py'
Jan 26 19:24:29 compute-0 sudo[158176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:30 compute-0 python3.9[158178]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:30 compute-0 sudo[158176]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:30 compute-0 sudo[158328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcflfzrodskytubdwjlrdhuawqagzodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455470.4384434-2567-169154844856952/AnsiballZ_command.py'
Jan 26 19:24:30 compute-0 sudo[158328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:30 compute-0 python3.9[158330]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:24:31 compute-0 sudo[158328]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:31 compute-0 sudo[158481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asydcjwhrnhuxyskzwizbtxjedgmnqdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455471.23958-2583-233560365926485/AnsiballZ_stat.py'
Jan 26 19:24:31 compute-0 sudo[158481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:31 compute-0 python3.9[158483]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:24:31 compute-0 sudo[158481]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:32 compute-0 sudo[158635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfunctlyyjcnrxesounmgiivupqyrmjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455472.053054-2599-271720822071660/AnsiballZ_command.py'
Jan 26 19:24:32 compute-0 sudo[158635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:32 compute-0 python3.9[158637]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:24:32 compute-0 sudo[158635]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:33 compute-0 sudo[158790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogaalcfxhxzuvwnbnfdfxiaifuqxplxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455472.8314586-2615-20320855532358/AnsiballZ_file.py'
Jan 26 19:24:33 compute-0 sudo[158790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:33 compute-0 python3.9[158792]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:33 compute-0 sudo[158790]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:34 compute-0 sudo[158942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vykhhsljcocmlgduwnmdopkfochlsozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455473.629737-2631-23293060173601/AnsiballZ_stat.py'
Jan 26 19:24:34 compute-0 sudo[158942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:34 compute-0 python3.9[158944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:34 compute-0 sudo[158942]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:34 compute-0 sudo[159065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sihiocoayvqcdwshurjddgndjemcavyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455473.629737-2631-23293060173601/AnsiballZ_copy.py'
Jan 26 19:24:34 compute-0 sudo[159065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:34 compute-0 python3.9[159067]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455473.629737-2631-23293060173601/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:34 compute-0 sudo[159065]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:35 compute-0 sudo[159217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxxtlhwvltymyceqlyfifrrvhcjoukkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455475.1613722-2661-8015323298260/AnsiballZ_stat.py'
Jan 26 19:24:35 compute-0 sudo[159217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:35 compute-0 python3.9[159219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:35 compute-0 sudo[159217]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:36 compute-0 sudo[159340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szgyzhoekvokqtgxayiywphkwdxogjha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455475.1613722-2661-8015323298260/AnsiballZ_copy.py'
Jan 26 19:24:36 compute-0 sudo[159340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:36 compute-0 python3.9[159342]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455475.1613722-2661-8015323298260/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:36 compute-0 sudo[159340]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:36 compute-0 sudo[159492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdtmqbzjstnpwazkrgoalhuvmsaiooh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455476.5960023-2691-26684850353062/AnsiballZ_stat.py'
Jan 26 19:24:36 compute-0 sudo[159492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:37 compute-0 python3.9[159494]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:24:37 compute-0 sudo[159492]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:37 compute-0 sudo[159615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekdeonvcoxswefncfnbpoiwyuocvkfow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455476.5960023-2691-26684850353062/AnsiballZ_copy.py'
Jan 26 19:24:37 compute-0 sudo[159615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:37 compute-0 python3.9[159617]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455476.5960023-2691-26684850353062/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:24:37 compute-0 sudo[159615]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:38 compute-0 sudo[159767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqenckcrcveguahibacynzdzvjjxtakd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455478.0880902-2721-218503184081908/AnsiballZ_systemd.py'
Jan 26 19:24:38 compute-0 sudo[159767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:38 compute-0 python3.9[159769]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:24:38 compute-0 systemd[1]: Reloading.
Jan 26 19:24:38 compute-0 systemd-rc-local-generator[159796]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:38 compute-0 systemd-sysv-generator[159799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:39 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 26 19:24:39 compute-0 sudo[159767]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:39 compute-0 sudo[159957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtnusgrbkggjybxtxqsnivwyaehbxtqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455479.406883-2737-158585256555810/AnsiballZ_systemd.py'
Jan 26 19:24:39 compute-0 sudo[159957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:40 compute-0 python3.9[159959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 26 19:24:40 compute-0 systemd[1]: Reloading.
Jan 26 19:24:40 compute-0 systemd-rc-local-generator[159986]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:40 compute-0 systemd-sysv-generator[159989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:40 compute-0 systemd[1]: Reloading.
Jan 26 19:24:40 compute-0 systemd-rc-local-generator[160020]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:24:40 compute-0 systemd-sysv-generator[160025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:24:40 compute-0 sudo[159957]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:41 compute-0 sshd-session[105227]: Connection closed by 192.168.122.30 port 57980
Jan 26 19:24:41 compute-0 sshd-session[105224]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:24:41 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 26 19:24:41 compute-0 systemd[1]: session-23.scope: Consumed 3min 54.313s CPU time.
Jan 26 19:24:41 compute-0 systemd-logind[794]: Session 23 logged out. Waiting for processes to exit.
Jan 26 19:24:41 compute-0 systemd-logind[794]: Removed session 23.
Jan 26 19:24:46 compute-0 sshd-session[160058]: Accepted publickey for zuul from 192.168.122.30 port 35026 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:24:46 compute-0 systemd-logind[794]: New session 24 of user zuul.
Jan 26 19:24:46 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 26 19:24:46 compute-0 sshd-session[160058]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:24:47 compute-0 python3.9[160211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:24:48 compute-0 python3.9[160365]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:24:48 compute-0 network[160382]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:24:48 compute-0 network[160383]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:24:48 compute-0 network[160384]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:24:53 compute-0 sudo[160667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boribefrkyispkuhxcljaoceepkicmok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455493.4239938-69-275333955336660/AnsiballZ_setup.py'
Jan 26 19:24:53 compute-0 sudo[160667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:53 compute-0 podman[160627]: 2026-01-26 19:24:53.857281109 +0000 UTC m=+0.136641168 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller)
Jan 26 19:24:54 compute-0 python3.9[160674]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 26 19:24:54 compute-0 sudo[160667]: pam_unix(sudo:session): session closed for user root
Jan 26 19:24:54 compute-0 sudo[160761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpltrmwbqjmnwfhwkiyzhovyfhtsciwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455493.4239938-69-275333955336660/AnsiballZ_dnf.py'
Jan 26 19:24:54 compute-0 sudo[160761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:24:55 compute-0 python3.9[160763]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:24:58 compute-0 podman[160766]: 2026-01-26 19:24:58.343933973 +0000 UTC m=+0.081145308 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 19:25:00 compute-0 sudo[160761]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:01 compute-0 sudo[160935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgboshacmpmrqxvuvrwutudtoaeywsaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455501.1654458-93-219438943297828/AnsiballZ_stat.py'
Jan 26 19:25:01 compute-0 sudo[160935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:01 compute-0 python3.9[160937]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:25:01 compute-0 sudo[160935]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:02 compute-0 sudo[161087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmubmdrmcwsgvrlkaveqqfhfeowlkhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455502.2118547-113-98700735939310/AnsiballZ_command.py'
Jan 26 19:25:02 compute-0 sudo[161087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:02 compute-0 python3.9[161089]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:25:02 compute-0 sudo[161087]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:03 compute-0 sudo[161240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmapfdqdcrzmdrldzzvzoqnuzlzuonvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455503.232786-133-55844942007778/AnsiballZ_stat.py'
Jan 26 19:25:03 compute-0 sudo[161240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:03 compute-0 python3.9[161242]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:25:03 compute-0 sudo[161240]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:04 compute-0 sudo[161392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfgsqohysinzenzgxvnlaefueyfrzfch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455504.0443308-149-98511594514716/AnsiballZ_command.py'
Jan 26 19:25:04 compute-0 sudo[161392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:04 compute-0 python3.9[161394]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:25:04 compute-0 sudo[161392]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:05 compute-0 sudo[161545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrhiiiraknyoeztiwfjrhawmbekfcspy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455504.8264296-165-115998704548272/AnsiballZ_stat.py'
Jan 26 19:25:05 compute-0 sudo[161545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:05 compute-0 python3.9[161547]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:25:05 compute-0 sudo[161545]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:05 compute-0 sudo[161668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vntiqxhgflfengyhjnnynbbnsumomkxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455504.8264296-165-115998704548272/AnsiballZ_copy.py'
Jan 26 19:25:05 compute-0 sudo[161668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:06 compute-0 python3.9[161670]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455504.8264296-165-115998704548272/.source.iscsi _original_basename=.52eqydf6 follow=False checksum=02a946d36b2ba8a208e1316350d1af87bde9d67f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:06 compute-0 sudo[161668]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:06 compute-0 sudo[161820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjdbxtjdppvjyfqxndrvhxglpjvekjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455506.3482919-195-148161742393932/AnsiballZ_file.py'
Jan 26 19:25:06 compute-0 sudo[161820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:07 compute-0 python3.9[161822]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:07 compute-0 sudo[161820]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:07 compute-0 sudo[161972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffalegkzksjnjopmpkmwtegeipsyqviz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455507.2614257-211-100467540725080/AnsiballZ_lineinfile.py'
Jan 26 19:25:07 compute-0 sudo[161972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:07 compute-0 python3.9[161974]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:07 compute-0 sudo[161972]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:08 compute-0 sudo[162124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uycmspxhgdefhmbxbrqmeofefuitafcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455508.1981697-229-60881114536065/AnsiballZ_systemd_service.py'
Jan 26 19:25:08 compute-0 sudo[162124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:09 compute-0 python3.9[162126]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:25:10 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 26 19:25:10 compute-0 sudo[162124]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:10 compute-0 sudo[162280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgqbztdrnjgeiehmigeagzdpprtisqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455510.5784922-245-237306219272986/AnsiballZ_systemd_service.py'
Jan 26 19:25:10 compute-0 sudo[162280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:11 compute-0 python3.9[162282]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:25:11 compute-0 systemd[1]: Reloading.
Jan 26 19:25:11 compute-0 systemd-rc-local-generator[162311]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:11 compute-0 systemd-sysv-generator[162314]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:11 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 19:25:11 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 26 19:25:11 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 26 19:25:11 compute-0 systemd[1]: Started Open-iSCSI.
Jan 26 19:25:11 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 26 19:25:11 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 26 19:25:11 compute-0 sudo[162280]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:12 compute-0 python3.9[162479]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:25:12 compute-0 network[162496]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:25:12 compute-0 network[162497]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:25:12 compute-0 network[162498]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:25:18 compute-0 sudo[162767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjonpucyugsidzjleanbkvozdkijxhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455517.956612-291-96801342148724/AnsiballZ_dnf.py'
Jan 26 19:25:18 compute-0 sudo[162767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:18 compute-0 python3.9[162769]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:25:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:25:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:25:21 compute-0 systemd[1]: Reloading.
Jan 26 19:25:21 compute-0 systemd-rc-local-generator[162815]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:21 compute-0 systemd-sysv-generator[162818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:25:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:25:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:25:21 compute-0 systemd[1]: run-r15a89797b8a64c5895cd1a2e27b56c34.service: Deactivated successfully.
Jan 26 19:25:21 compute-0 sudo[162767]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:22 compute-0 sudo[163082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbcbfbektjixxfjgndpjtyogjskytsqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455522.1303935-309-120140137253096/AnsiballZ_file.py'
Jan 26 19:25:22 compute-0 sudo[163082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:22 compute-0 python3.9[163084]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 19:25:22 compute-0 sudo[163082]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:23 compute-0 sudo[163234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gauuvjgezfdxvktbtfimslzwphrhelmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455522.9893913-325-262582124470850/AnsiballZ_modprobe.py'
Jan 26 19:25:23 compute-0 sudo[163234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:23 compute-0 python3.9[163236]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 26 19:25:23 compute-0 sudo[163234]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:25:23.997 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:25:23.999 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:25:23.999 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:25:24 compute-0 podman[163341]: 2026-01-26 19:25:24.421198312 +0000 UTC m=+0.162438551 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller)
Jan 26 19:25:24 compute-0 sudo[163415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsrqmijxdsfzlzustshlogfdyqsovlso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455524.0006406-341-133101025088144/AnsiballZ_stat.py'
Jan 26 19:25:24 compute-0 sudo[163415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:24 compute-0 python3.9[163417]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:25:24 compute-0 sudo[163415]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:25 compute-0 sudo[163538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riuzxsoisohaftqiqhczgugfltvfxzbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455524.0006406-341-133101025088144/AnsiballZ_copy.py'
Jan 26 19:25:25 compute-0 sudo[163538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:25 compute-0 python3.9[163540]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455524.0006406-341-133101025088144/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:25 compute-0 sudo[163538]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:26 compute-0 sudo[163690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrflkusrsmfdvvnntknkccvwcixujkxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455525.6482613-373-89613547584754/AnsiballZ_lineinfile.py'
Jan 26 19:25:26 compute-0 sudo[163690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:26 compute-0 python3.9[163692]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:26 compute-0 sudo[163690]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:27 compute-0 sudo[163842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrghhzkfolwhhiglxqrmysairhrzfcta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455526.5311632-389-184452913877177/AnsiballZ_systemd.py'
Jan 26 19:25:27 compute-0 sudo[163842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:27 compute-0 python3.9[163844]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:25:27 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 19:25:27 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 26 19:25:27 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 26 19:25:27 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 26 19:25:27 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 26 19:25:27 compute-0 sudo[163842]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:28 compute-0 sudo[163998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfpximqmyszmynxopnkywmmrhoxevuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455527.9165783-405-99722773963752/AnsiballZ_command.py'
Jan 26 19:25:28 compute-0 sudo[163998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:28 compute-0 python3.9[164000]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:25:28 compute-0 sudo[163998]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:29 compute-0 sudo[164162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjxmhthlvpgoihyggubypybahjohneh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455528.8943746-425-271393163010720/AnsiballZ_stat.py'
Jan 26 19:25:29 compute-0 sudo[164162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:29 compute-0 podman[164125]: 2026-01-26 19:25:29.361367399 +0000 UTC m=+0.099546471 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 19:25:29 compute-0 python3.9[164164]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:25:29 compute-0 sudo[164162]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:30 compute-0 sudo[164320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fztmmkquvhtzsegmftuknszoxelmssey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455529.8703036-443-252180881379166/AnsiballZ_stat.py'
Jan 26 19:25:30 compute-0 sudo[164320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:30 compute-0 python3.9[164322]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:25:30 compute-0 sudo[164320]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:31 compute-0 sudo[164443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acoyxudarvacgwtubnjmhdjiiipjtcbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455529.8703036-443-252180881379166/AnsiballZ_copy.py'
Jan 26 19:25:31 compute-0 sudo[164443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:31 compute-0 python3.9[164445]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455529.8703036-443-252180881379166/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:31 compute-0 sudo[164443]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:32 compute-0 sudo[164595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijoluvfkkyxzbgyhisstxunnfqwobsiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455531.6885486-473-251320281537099/AnsiballZ_command.py'
Jan 26 19:25:32 compute-0 sudo[164595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:32 compute-0 python3.9[164597]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:25:32 compute-0 sudo[164595]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:32 compute-0 sudo[164748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcjrcjdgfblbaeylduwncswjhivjmsag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455532.5484192-489-138878310840501/AnsiballZ_lineinfile.py'
Jan 26 19:25:32 compute-0 sudo[164748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:33 compute-0 python3.9[164750]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:33 compute-0 sudo[164748]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:33 compute-0 sudo[164900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwtybbotywjcfbncwxqymxugwhwcozjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455533.3999257-505-140925963243093/AnsiballZ_replace.py'
Jan 26 19:25:33 compute-0 sudo[164900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:34 compute-0 python3.9[164902]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:34 compute-0 sudo[164900]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:34 compute-0 sudo[165052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kolrdvjbrqlyvjcpzuwmxybdrlhvqeqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455534.2653387-521-102816179973297/AnsiballZ_replace.py'
Jan 26 19:25:34 compute-0 sudo[165052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:34 compute-0 python3.9[165054]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:34 compute-0 sudo[165052]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:35 compute-0 sudo[165204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-runlhfeaeqwvljrgbdawqmwyhdbyimoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455535.1395628-539-71628645595146/AnsiballZ_lineinfile.py'
Jan 26 19:25:35 compute-0 sudo[165204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:35 compute-0 python3.9[165206]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:35 compute-0 sudo[165204]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:36 compute-0 sudo[165356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sltggmfjsxdqrqbmuatkqglepeoofkil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455535.8392062-539-214686278167283/AnsiballZ_lineinfile.py'
Jan 26 19:25:36 compute-0 sudo[165356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:36 compute-0 python3.9[165358]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:36 compute-0 sudo[165356]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:36 compute-0 sudo[165508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfbbuxosquroxzslnsfihoxpzqsmckgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455536.5667403-539-187225291631368/AnsiballZ_lineinfile.py'
Jan 26 19:25:36 compute-0 sudo[165508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:37 compute-0 python3.9[165510]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:37 compute-0 sudo[165508]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:37 compute-0 sudo[165660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yafzceqbouvveejoaahqwtmitslhmamw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455537.194056-539-141984942087083/AnsiballZ_lineinfile.py'
Jan 26 19:25:37 compute-0 sudo[165660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:37 compute-0 python3.9[165662]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:37 compute-0 sudo[165660]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:38 compute-0 sudo[165812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbaiaktjusachflkbkokfviebwhcviga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455537.9636421-597-91448110630059/AnsiballZ_stat.py'
Jan 26 19:25:38 compute-0 sudo[165812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:38 compute-0 python3.9[165814]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:25:38 compute-0 sudo[165812]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:39 compute-0 sudo[165966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcqnvreqkxecgoahtcwalogepdadaabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455538.828557-613-54625088179462/AnsiballZ_command.py'
Jan 26 19:25:39 compute-0 sudo[165966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:39 compute-0 python3.9[165968]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:25:39 compute-0 sudo[165966]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:40 compute-0 sudo[166119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwohdacuionxryquijygbbhavqhsfno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455539.6541946-631-219922200760470/AnsiballZ_systemd_service.py'
Jan 26 19:25:40 compute-0 sudo[166119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:40 compute-0 python3.9[166121]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:25:41 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 26 19:25:41 compute-0 sudo[166119]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:42 compute-0 sudo[166275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnmaoxfxkevjtrpykrnrmmxkyqmnjnfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455541.7131271-647-52734345080308/AnsiballZ_systemd_service.py'
Jan 26 19:25:42 compute-0 sudo[166275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:42 compute-0 python3.9[166277]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:25:42 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 26 19:25:42 compute-0 udevadm[166282]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 26 19:25:42 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 26 19:25:42 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 19:25:42 compute-0 multipathd[166286]: --------start up--------
Jan 26 19:25:42 compute-0 multipathd[166286]: read /etc/multipath.conf
Jan 26 19:25:42 compute-0 multipathd[166286]: path checkers start up
Jan 26 19:25:42 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 19:25:42 compute-0 sudo[166275]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:43 compute-0 sudo[166443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxngvbetixqnuqsbyalsdvqzaojnosel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455543.1164484-671-131985581273388/AnsiballZ_file.py'
Jan 26 19:25:43 compute-0 sudo[166443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:43 compute-0 python3.9[166445]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 26 19:25:43 compute-0 sudo[166443]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:44 compute-0 sudo[166595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbfgmmezdspmeavlhummskjrqivombcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455543.9208264-687-21908559416808/AnsiballZ_modprobe.py'
Jan 26 19:25:44 compute-0 sudo[166595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:44 compute-0 python3.9[166597]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 26 19:25:44 compute-0 kernel: Key type psk registered
Jan 26 19:25:44 compute-0 sudo[166595]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:45 compute-0 sudo[166758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnqoxqkwkwbyxpkihcecoiizozuwmlan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455544.908359-703-257377674932412/AnsiballZ_stat.py'
Jan 26 19:25:45 compute-0 sudo[166758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:45 compute-0 python3.9[166760]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:25:45 compute-0 sudo[166758]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:46 compute-0 sudo[166881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjvemfngkeoxsguodzryoamnrujtovey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455544.908359-703-257377674932412/AnsiballZ_copy.py'
Jan 26 19:25:46 compute-0 sudo[166881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:46 compute-0 python3.9[166883]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455544.908359-703-257377674932412/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:46 compute-0 sudo[166881]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:46 compute-0 sudo[167033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjtjqdhmrersblhuplmlumyamasdwqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455546.602386-735-109842362271046/AnsiballZ_lineinfile.py'
Jan 26 19:25:46 compute-0 sudo[167033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:47 compute-0 python3.9[167035]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:47 compute-0 sudo[167033]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:47 compute-0 sudo[167185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagveknqrgrlbyzzouilgxzxltctzgql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455547.4383717-751-204787898602606/AnsiballZ_systemd.py'
Jan 26 19:25:47 compute-0 sudo[167185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:48 compute-0 python3.9[167187]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:25:48 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 26 19:25:48 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 26 19:25:48 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 26 19:25:48 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 26 19:25:48 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 26 19:25:48 compute-0 sudo[167185]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:48 compute-0 sudo[167341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbqsgqjccvexbbeyytbiynoogdcellku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455548.6211374-767-178823527640140/AnsiballZ_dnf.py'
Jan 26 19:25:48 compute-0 sudo[167341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:49 compute-0 python3.9[167343]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 26 19:25:51 compute-0 systemd[1]: Reloading.
Jan 26 19:25:51 compute-0 systemd-rc-local-generator[167373]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:51 compute-0 systemd-sysv-generator[167377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:51 compute-0 systemd[1]: Reloading.
Jan 26 19:25:51 compute-0 systemd-rc-local-generator[167409]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:51 compute-0 systemd-sysv-generator[167412]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:52 compute-0 systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 26 19:25:52 compute-0 systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 26 19:25:52 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 26 19:25:52 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 26 19:25:52 compute-0 systemd[1]: Reloading.
Jan 26 19:25:52 compute-0 systemd-rc-local-generator[167505]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:52 compute-0 systemd-sysv-generator[167510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:52 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 26 19:25:53 compute-0 sudo[167341]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:53 compute-0 sudo[168695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkudnpfgqiqvaiemuksupjzizqvzzvaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455553.6191125-783-83391052285633/AnsiballZ_systemd_service.py'
Jan 26 19:25:53 compute-0 sudo[168695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 26 19:25:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 26 19:25:54 compute-0 python3.9[168724]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:25:54 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.176s CPU time.
Jan 26 19:25:54 compute-0 systemd[1]: run-rbe8694d1b39e4a94885b16f86b7b4386.service: Deactivated successfully.
Jan 26 19:25:54 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 26 19:25:54 compute-0 iscsid[162321]: iscsid shutting down.
Jan 26 19:25:54 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 26 19:25:54 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 26 19:25:54 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 26 19:25:54 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 26 19:25:54 compute-0 systemd[1]: Started Open-iSCSI.
Jan 26 19:25:54 compute-0 sudo[168695]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:54 compute-0 sudo[168976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izqyijvfwmlscsrotpjoyoexfjjqkndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455554.5871067-799-229574110099585/AnsiballZ_systemd_service.py'
Jan 26 19:25:54 compute-0 sudo[168976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:55 compute-0 podman[168939]: 2026-01-26 19:25:55.073954598 +0000 UTC m=+0.156301165 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:25:55 compute-0 python3.9[168984]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:25:55 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 26 19:25:55 compute-0 multipathd[166286]: exit (signal)
Jan 26 19:25:55 compute-0 multipathd[166286]: --------shut down-------
Jan 26 19:25:55 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 26 19:25:55 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 26 19:25:55 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 26 19:25:55 compute-0 multipathd[169000]: --------start up--------
Jan 26 19:25:55 compute-0 multipathd[169000]: read /etc/multipath.conf
Jan 26 19:25:55 compute-0 multipathd[169000]: path checkers start up
Jan 26 19:25:55 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 26 19:25:55 compute-0 sudo[168976]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:56 compute-0 python3.9[169157]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:25:57 compute-0 sudo[169311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfgqjsbzluxzdvcocftgeafydggfcpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455557.141134-834-248267639957292/AnsiballZ_file.py'
Jan 26 19:25:57 compute-0 sudo[169311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:57 compute-0 python3.9[169313]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:25:57 compute-0 sudo[169311]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:58 compute-0 sudo[169463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvnfrlzbrvvqqbahshisbctrhosgmakn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455558.220179-856-172134103957596/AnsiballZ_systemd_service.py'
Jan 26 19:25:58 compute-0 sudo[169463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:25:58 compute-0 python3.9[169465]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:25:58 compute-0 systemd[1]: Reloading.
Jan 26 19:25:59 compute-0 systemd-rc-local-generator[169492]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:25:59 compute-0 systemd-sysv-generator[169495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:25:59 compute-0 sudo[169463]: pam_unix(sudo:session): session closed for user root
Jan 26 19:25:59 compute-0 sshd-session[169525]: Invalid user loginuser from 193.32.162.151 port 38678
Jan 26 19:25:59 compute-0 podman[169626]: 2026-01-26 19:25:59.808737054 +0000 UTC m=+0.068063081 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:25:59 compute-0 sshd-session[169525]: Connection closed by invalid user loginuser 193.32.162.151 port 38678 [preauth]
Jan 26 19:25:59 compute-0 python3.9[169662]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:26:00 compute-0 network[169688]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:26:00 compute-0 network[169689]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:26:00 compute-0 network[169690]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:26:04 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 26 19:26:06 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 19:26:06 compute-0 sudo[169962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vldeyilcdpkzhsckcuegdepmlwfftqzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455566.4088786-894-75108442821296/AnsiballZ_systemd_service.py'
Jan 26 19:26:06 compute-0 sudo[169962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:07 compute-0 python3.9[169964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:07 compute-0 sudo[169962]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:07 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 26 19:26:07 compute-0 sudo[170116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbnhaqnksogsmifncvhinfkliqpjxqsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455567.3357835-894-170786338366236/AnsiballZ_systemd_service.py'
Jan 26 19:26:07 compute-0 sudo[170116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:08 compute-0 python3.9[170118]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:08 compute-0 sudo[170116]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:08 compute-0 sudo[170269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agnwahqewvfeyktqhpkfkjtvrlmstbbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455568.274838-894-161113437755507/AnsiballZ_systemd_service.py'
Jan 26 19:26:08 compute-0 sudo[170269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:08 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 26 19:26:09 compute-0 python3.9[170271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:09 compute-0 sudo[170269]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:09 compute-0 sudo[170423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lccnrskusnkjpqkmuhobvpwralqlocnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455569.2284586-894-241844219368735/AnsiballZ_systemd_service.py'
Jan 26 19:26:09 compute-0 sudo[170423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:09 compute-0 python3.9[170425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:09 compute-0 sudo[170423]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:10 compute-0 sudo[170576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phgrdevjokfotvaeqxzuwnmbiegrcaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455570.1641057-894-207268822267587/AnsiballZ_systemd_service.py'
Jan 26 19:26:10 compute-0 sudo[170576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:10 compute-0 python3.9[170578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:10 compute-0 sudo[170576]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:11 compute-0 sudo[170729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzipftgfpvituyjijguzwbxwseahgmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455571.1460223-894-58874085191291/AnsiballZ_systemd_service.py'
Jan 26 19:26:11 compute-0 sudo[170729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:11 compute-0 python3.9[170731]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:11 compute-0 sudo[170729]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:12 compute-0 sudo[170882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqgkjulekhoxgxguamazrxswjzfdtapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455571.9431112-894-1033987305582/AnsiballZ_systemd_service.py'
Jan 26 19:26:12 compute-0 sudo[170882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:12 compute-0 python3.9[170884]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:12 compute-0 sudo[170882]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:13 compute-0 sudo[171035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acjylprzosserwfzhwuxxppuleomlmax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455572.7904572-894-278744445868415/AnsiballZ_systemd_service.py'
Jan 26 19:26:13 compute-0 sudo[171035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:13 compute-0 python3.9[171037]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:26:13 compute-0 sudo[171035]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:14 compute-0 sudo[171188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwarwwvwynrdhfyubpppeqcqwfzdxufj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455573.9309757-1012-152207054873725/AnsiballZ_file.py'
Jan 26 19:26:14 compute-0 sudo[171188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:14 compute-0 python3.9[171190]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:14 compute-0 sudo[171188]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:15 compute-0 sudo[171340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxphnyywuflmmxjomcsupfkkjocxmnar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455574.7238414-1012-70451344049997/AnsiballZ_file.py'
Jan 26 19:26:15 compute-0 sudo[171340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:15 compute-0 python3.9[171342]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:15 compute-0 sudo[171340]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:15 compute-0 sudo[171492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eplbbpwcqodnqtwxidwkryopzoymmncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455575.560235-1012-153087080307156/AnsiballZ_file.py'
Jan 26 19:26:15 compute-0 sudo[171492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:16 compute-0 python3.9[171494]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:16 compute-0 sudo[171492]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:16 compute-0 sudo[171644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-houfsqcztmjfcltwamzroqaizgeqesbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455576.3151872-1012-151086922885602/AnsiballZ_file.py'
Jan 26 19:26:16 compute-0 sudo[171644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:16 compute-0 python3.9[171646]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:16 compute-0 sudo[171644]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:17 compute-0 sudo[171796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irabvlwyeqhcdtoekwutihdtidziiagj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455577.0615242-1012-264715656083109/AnsiballZ_file.py'
Jan 26 19:26:17 compute-0 sudo[171796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:17 compute-0 python3.9[171798]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:17 compute-0 sudo[171796]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:18 compute-0 sudo[171948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpgutvjzyurtmkztiqwkgslckitmybvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455577.7474334-1012-48501248815003/AnsiballZ_file.py'
Jan 26 19:26:18 compute-0 sudo[171948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:18 compute-0 python3.9[171950]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:18 compute-0 sudo[171948]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:18 compute-0 sudo[172100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxmqrabwqskefozklrzmzecstpulyoez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455578.5750608-1012-122638780959980/AnsiballZ_file.py'
Jan 26 19:26:18 compute-0 sudo[172100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:19 compute-0 python3.9[172102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:19 compute-0 sudo[172100]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:19 compute-0 sudo[172252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxpcnvowmjhthqtkmeutgfghipevnwvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455579.386283-1012-170891595724766/AnsiballZ_file.py'
Jan 26 19:26:19 compute-0 sudo[172252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:19 compute-0 python3.9[172254]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:20 compute-0 sudo[172252]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:20 compute-0 sudo[172404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dumvakxdgszkfcisjbgshhrlimwegsqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455580.252658-1126-54040273515116/AnsiballZ_file.py'
Jan 26 19:26:20 compute-0 sudo[172404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:20 compute-0 python3.9[172406]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:20 compute-0 sudo[172404]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:21 compute-0 sudo[172556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxlhpidnifrfmdmsgweeatmiywyfbibb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455581.1006181-1126-166218461293929/AnsiballZ_file.py'
Jan 26 19:26:21 compute-0 sudo[172556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:21 compute-0 python3.9[172558]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:21 compute-0 sudo[172556]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:22 compute-0 sudo[172708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haoouaezgxjlloschtevskvxbzoljerz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455581.8302467-1126-1772084320250/AnsiballZ_file.py'
Jan 26 19:26:22 compute-0 sudo[172708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:22 compute-0 python3.9[172710]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:22 compute-0 sudo[172708]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:23 compute-0 sudo[172860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tefzzbbhaerrmfcgidtooahxlgzntzhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455582.643064-1126-269672151637391/AnsiballZ_file.py'
Jan 26 19:26:23 compute-0 sudo[172860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:23 compute-0 python3.9[172862]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:23 compute-0 sudo[172860]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:23 compute-0 sudo[173012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kveewcidhlnabjwmcihvznlxhilubgzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455583.4673269-1126-173095372814962/AnsiballZ_file.py'
Jan 26 19:26:23 compute-0 sudo[173012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:26:24.001 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:26:24.002 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:26:24.002 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:26:24 compute-0 python3.9[173014]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:24 compute-0 sudo[173012]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:24 compute-0 sudo[173165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vglmcbfnwqpmjcsomeilvjxhfgpzpcqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455584.2340286-1126-266770183257333/AnsiballZ_file.py'
Jan 26 19:26:24 compute-0 sudo[173165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:24 compute-0 python3.9[173167]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:24 compute-0 sudo[173165]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:25 compute-0 podman[173267]: 2026-01-26 19:26:25.399288083 +0000 UTC m=+0.136518202 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:26:25 compute-0 sudo[173343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndlacmeogyjjjfycfuuxqahfglqskrtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455585.038751-1126-228254396760058/AnsiballZ_file.py'
Jan 26 19:26:25 compute-0 sudo[173343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:25 compute-0 python3.9[173345]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:25 compute-0 sudo[173343]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:26 compute-0 sudo[173495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yerlfksuilsaaxhszaenfimqandornla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455585.8221142-1126-58147837540743/AnsiballZ_file.py'
Jan 26 19:26:26 compute-0 sudo[173495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:26 compute-0 python3.9[173497]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:26:26 compute-0 sudo[173495]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:27 compute-0 sudo[173647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohhlaptofkahnkkjxksmkxwdvedwoviq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455586.828987-1242-260969204901961/AnsiballZ_command.py'
Jan 26 19:26:27 compute-0 sudo[173647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:27 compute-0 python3.9[173649]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:27 compute-0 sudo[173647]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:28 compute-0 python3.9[173801]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:26:29 compute-0 sudo[173951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqksqvlwknmbghosbevtajiooeqjiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455588.8086255-1278-272065789106958/AnsiballZ_systemd_service.py'
Jan 26 19:26:29 compute-0 sudo[173951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:29 compute-0 python3.9[173953]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:26:29 compute-0 systemd[1]: Reloading.
Jan 26 19:26:29 compute-0 systemd-rc-local-generator[173980]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:26:29 compute-0 systemd-sysv-generator[173983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:26:29 compute-0 sudo[173951]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:30 compute-0 podman[173988]: 2026-01-26 19:26:30.031950783 +0000 UTC m=+0.085305520 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 19:26:30 compute-0 sudo[174156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqdpqyikixziuyqirzfkaxtmpjpidpvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455590.2076342-1294-270502122342239/AnsiballZ_command.py'
Jan 26 19:26:30 compute-0 sudo[174156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:30 compute-0 python3.9[174158]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:30 compute-0 sudo[174156]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:31 compute-0 sudo[174309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjseqzhwwygozdqfcjouhgpsfvranuck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455591.0310743-1294-120993168678521/AnsiballZ_command.py'
Jan 26 19:26:31 compute-0 sudo[174309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:31 compute-0 python3.9[174311]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:31 compute-0 sudo[174309]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:32 compute-0 sudo[174462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqlgtcwdmwmhtaqhemyjptrrkybagqoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455591.7599487-1294-15644004687853/AnsiballZ_command.py'
Jan 26 19:26:32 compute-0 sudo[174462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:32 compute-0 python3.9[174464]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:32 compute-0 sudo[174462]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:32 compute-0 sudo[174615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtlvviicxkburzmonmdzpiklqskjzldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455592.5488415-1294-101394046712772/AnsiballZ_command.py'
Jan 26 19:26:32 compute-0 sudo[174615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:33 compute-0 python3.9[174617]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:33 compute-0 sudo[174615]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:33 compute-0 sudo[174768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryouusdxdalxtcpwctfsdtdagzdmqrdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455593.3248732-1294-182017564913336/AnsiballZ_command.py'
Jan 26 19:26:33 compute-0 sudo[174768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:33 compute-0 python3.9[174770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:33 compute-0 sudo[174768]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:34 compute-0 sudo[174921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkutmuoforhaeehzvxnkogmcrixaloiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455594.1131957-1294-177651189548428/AnsiballZ_command.py'
Jan 26 19:26:34 compute-0 sudo[174921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:34 compute-0 python3.9[174923]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:34 compute-0 sudo[174921]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:35 compute-0 sudo[175074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzghmnzcjxuuvzkbfmtxizkesbfxdzph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455594.9114184-1294-203984047835198/AnsiballZ_command.py'
Jan 26 19:26:35 compute-0 sudo[175074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:35 compute-0 python3.9[175076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:35 compute-0 sudo[175074]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:36 compute-0 sudo[175227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjuflyerovwowtyqflxxokbasogpmaee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455595.913899-1294-50722695125924/AnsiballZ_command.py'
Jan 26 19:26:36 compute-0 sudo[175227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:36 compute-0 python3.9[175229]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:26:36 compute-0 sudo[175227]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:38 compute-0 sudo[175380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gibmcbipfukaumszrkfrmbgueissbojm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455597.6278086-1437-222975159599972/AnsiballZ_file.py'
Jan 26 19:26:38 compute-0 sudo[175380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:38 compute-0 python3.9[175382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:38 compute-0 sudo[175380]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:38 compute-0 sudo[175532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndoycyfrzdhltrmfhyhluldwgbxzsnha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455598.481987-1437-97921573847296/AnsiballZ_file.py'
Jan 26 19:26:38 compute-0 sudo[175532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:39 compute-0 python3.9[175534]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:39 compute-0 sudo[175532]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:39 compute-0 sudo[175684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpxiiiuhnypvqcybnygktfpqzukbcsyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455599.3338184-1437-270551271176800/AnsiballZ_file.py'
Jan 26 19:26:39 compute-0 sudo[175684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:40 compute-0 python3.9[175686]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:40 compute-0 sudo[175684]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:40 compute-0 sudo[175836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxsegkbwdqawjvgsysnninznsmrmvlam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455600.2321973-1481-54109786683879/AnsiballZ_file.py'
Jan 26 19:26:40 compute-0 sudo[175836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:40 compute-0 python3.9[175838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:40 compute-0 sudo[175836]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:41 compute-0 sudo[175988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhoekfpgmafpwpnewykwjnlobysstsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455601.0323157-1481-23953609130451/AnsiballZ_file.py'
Jan 26 19:26:41 compute-0 sudo[175988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:41 compute-0 python3.9[175990]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:41 compute-0 sudo[175988]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:42 compute-0 sudo[176140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjorjdwlfmlnfxuazuadkhqdwwntxegf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455601.8928165-1481-169038682604268/AnsiballZ_file.py'
Jan 26 19:26:42 compute-0 sudo[176140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:42 compute-0 python3.9[176142]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:42 compute-0 sudo[176140]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:42 compute-0 sudo[176292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvcsdhulwjbcsakuvzuitfaveogbvkux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455602.6144407-1481-98062399513675/AnsiballZ_file.py'
Jan 26 19:26:42 compute-0 sudo[176292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:43 compute-0 python3.9[176294]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:43 compute-0 sudo[176292]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:43 compute-0 sudo[176444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lukatpwxkikbrygpzxmvppxjhjbyecft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455603.3686671-1481-161824908338033/AnsiballZ_file.py'
Jan 26 19:26:43 compute-0 sudo[176444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:43 compute-0 python3.9[176446]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:43 compute-0 sudo[176444]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:44 compute-0 sudo[176596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyvwqekcgbrqtmwaukrwqoirniwfalgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455604.1712816-1481-7814678623803/AnsiballZ_file.py'
Jan 26 19:26:44 compute-0 sudo[176596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:44 compute-0 python3.9[176598]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:44 compute-0 sudo[176596]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:45 compute-0 sudo[176748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnolpqqfeokoswngmrfleydfqvgogna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455605.0050004-1481-59943387421046/AnsiballZ_file.py'
Jan 26 19:26:45 compute-0 sudo[176748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:45 compute-0 python3.9[176750]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:45 compute-0 sudo[176748]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:50 compute-0 sudo[176900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzwjlsylvtspsifmclghyudxkptmqmot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455610.269829-1718-152824223408392/AnsiballZ_getent.py'
Jan 26 19:26:50 compute-0 sudo[176900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:51 compute-0 python3.9[176902]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 26 19:26:51 compute-0 sudo[176900]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:51 compute-0 sudo[177053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqhcwzboozijzbzqdgczlvzltoabhwuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455611.3665493-1734-1951590240475/AnsiballZ_group.py'
Jan 26 19:26:51 compute-0 sudo[177053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:52 compute-0 python3.9[177055]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:26:52 compute-0 groupadd[177056]: group added to /etc/group: name=nova, GID=42436
Jan 26 19:26:52 compute-0 groupadd[177056]: group added to /etc/gshadow: name=nova
Jan 26 19:26:52 compute-0 groupadd[177056]: new group: name=nova, GID=42436
Jan 26 19:26:52 compute-0 sudo[177053]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:53 compute-0 sudo[177211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzokihgtodqcryxxnujfhacvhlfzevki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455612.493514-1750-192306652512299/AnsiballZ_user.py'
Jan 26 19:26:53 compute-0 sudo[177211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:26:53 compute-0 python3.9[177213]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 19:26:53 compute-0 useradd[177215]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 26 19:26:53 compute-0 useradd[177215]: add 'nova' to group 'libvirt'
Jan 26 19:26:53 compute-0 useradd[177215]: add 'nova' to shadow group 'libvirt'
Jan 26 19:26:53 compute-0 sudo[177211]: pam_unix(sudo:session): session closed for user root
Jan 26 19:26:54 compute-0 sshd-session[177246]: Accepted publickey for zuul from 192.168.122.30 port 37968 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:26:54 compute-0 systemd-logind[794]: New session 25 of user zuul.
Jan 26 19:26:54 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 26 19:26:54 compute-0 sshd-session[177246]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:26:54 compute-0 sshd-session[177249]: Received disconnect from 192.168.122.30 port 37968:11: disconnected by user
Jan 26 19:26:54 compute-0 sshd-session[177249]: Disconnected from user zuul 192.168.122.30 port 37968
Jan 26 19:26:54 compute-0 sshd-session[177246]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:26:54 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 26 19:26:54 compute-0 systemd-logind[794]: Session 25 logged out. Waiting for processes to exit.
Jan 26 19:26:54 compute-0 systemd-logind[794]: Removed session 25.
Jan 26 19:26:55 compute-0 python3.9[177399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:26:56 compute-0 podman[177494]: 2026-01-26 19:26:56.038480976 +0000 UTC m=+0.136556094 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:26:56 compute-0 python3.9[177533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455614.9639692-1800-252203035231498/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:56 compute-0 python3.9[177696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:26:57 compute-0 python3.9[177772]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:58 compute-0 python3.9[177922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:26:58 compute-0 python3.9[178043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455617.6245663-1800-116375912179643/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:26:59 compute-0 python3.9[178193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:00 compute-0 podman[178288]: 2026-01-26 19:27:00.25135168 +0000 UTC m=+0.072736096 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:27:00 compute-0 python3.9[178329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455619.1510382-1800-16927582369952/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:01 compute-0 python3.9[178481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:01 compute-0 python3.9[178602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455620.6420226-1800-176043708000254/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:02 compute-0 python3.9[178752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:03 compute-0 python3.9[178873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455621.9757276-1800-15167507660335/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:03 compute-0 sudo[179023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjezyzqmyhtwmvorlksedpzoozaaolv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455623.3620074-1966-162944452433966/AnsiballZ_file.py'
Jan 26 19:27:03 compute-0 sudo[179023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:03 compute-0 python3.9[179025]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:03 compute-0 sudo[179023]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:04 compute-0 sudo[179175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeqgdfmdjwfdoqpwmgshzhwvhspmsbss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455624.2026298-1982-154406726151650/AnsiballZ_copy.py'
Jan 26 19:27:04 compute-0 sudo[179175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:04 compute-0 python3.9[179177]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:04 compute-0 sudo[179175]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:05 compute-0 sudo[179327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwsdcppkclcbbjzdlmpfklumoedgsmwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455625.0284624-1998-2449117206790/AnsiballZ_stat.py'
Jan 26 19:27:05 compute-0 sudo[179327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:05 compute-0 python3.9[179329]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:05 compute-0 sudo[179327]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:06 compute-0 sudo[179479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmzawqyludxmfshagfyejlvxlefwhuot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455625.7914538-2014-251749697961551/AnsiballZ_stat.py'
Jan 26 19:27:06 compute-0 sudo[179479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:06 compute-0 python3.9[179481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:06 compute-0 sudo[179479]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:06 compute-0 sudo[179602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnywnbwkrltcxikwhkigjhfcdozqsfjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455625.7914538-2014-251749697961551/AnsiballZ_copy.py'
Jan 26 19:27:06 compute-0 sudo[179602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:07 compute-0 python3.9[179604]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769455625.7914538-2014-251749697961551/.source _original_basename=.dx1265hq follow=False checksum=634c4fe6c51f3aec810a9a9c0958719d21819d9d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 26 19:27:07 compute-0 sudo[179602]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:07 compute-0 python3.9[179756]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:08 compute-0 python3.9[179908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:09 compute-0 python3.9[180029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455628.212251-2066-153140196256970/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=7e468c4bb818176cdee8012ff62cf84889842d54 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:10 compute-0 python3.9[180179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:27:10 compute-0 python3.9[180300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455629.5275116-2096-16090247474079/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=8bdc7656189bf4cc1953b13b14a361a9ef9093a6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:11 compute-0 sudo[180450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttvkgbzzefbgixglheyaqspcvpmscpcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455631.1600394-2130-218303644891643/AnsiballZ_container_config_data.py'
Jan 26 19:27:11 compute-0 sudo[180450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:11 compute-0 python3.9[180452]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 26 19:27:11 compute-0 sudo[180450]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:12 compute-0 sudo[180602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsgiclovufbzdfaszkyurapknygfetli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455632.4389045-2152-144633241716319/AnsiballZ_container_config_hash.py'
Jan 26 19:27:12 compute-0 sudo[180602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:13 compute-0 python3.9[180604]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:27:13 compute-0 sudo[180602]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:14 compute-0 sudo[180754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxolzvhkahznfcythsxzqbqxusdzcaue ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455633.5208385-2172-3676733764109/AnsiballZ_edpm_container_manage.py'
Jan 26 19:27:14 compute-0 sudo[180754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:14 compute-0 python3[180756]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:27:14 compute-0 podman[180792]: 2026-01-26 19:27:14.568195283 +0000 UTC m=+0.075280104 container create a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260120, tcib_managed=true, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, container_name=nova_compute_init)
Jan 26 19:27:14 compute-0 podman[180792]: 2026-01-26 19:27:14.53049682 +0000 UTC m=+0.037581701 image pull 00a1d0493134435a0b50f81676478a7bc2e0126d0e30cb65072b8884b766f13f 38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 19:27:14 compute-0 python3[180756]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 26 19:27:14 compute-0 sudo[180754]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:15 compute-0 sudo[180980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iepmtxbdgbjetvvlexcrmiokzfoivbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455635.0010066-2188-11575869695763/AnsiballZ_stat.py'
Jan 26 19:27:15 compute-0 sudo[180980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:15 compute-0 python3.9[180982]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:15 compute-0 sudo[180980]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:16 compute-0 sudo[181134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwzrhlddocrjnbsgpixquigmxgxjiiwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455636.2435684-2212-12412966416508/AnsiballZ_container_config_data.py'
Jan 26 19:27:16 compute-0 sudo[181134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:16 compute-0 python3.9[181136]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 26 19:27:16 compute-0 sudo[181134]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:17 compute-0 sudo[181286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoirwjxslgumkblifmvwbxdwvhejnobh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455637.2538238-2234-101811026426395/AnsiballZ_container_config_hash.py'
Jan 26 19:27:17 compute-0 sudo[181286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:17 compute-0 python3.9[181288]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:27:17 compute-0 sudo[181286]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:18 compute-0 sudo[181438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcfermwgsxtnszqqeemnlkvctlunvrsc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455638.2331488-2254-187841199325956/AnsiballZ_edpm_container_manage.py'
Jan 26 19:27:18 compute-0 sudo[181438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:18 compute-0 python3[181440]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:27:19 compute-0 podman[181476]: 2026-01-26 19:27:19.119396337 +0000 UTC m=+0.058795405 container create a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 19:27:19 compute-0 podman[181476]: 2026-01-26 19:27:19.094277799 +0000 UTC m=+0.033676887 image pull 00a1d0493134435a0b50f81676478a7bc2e0126d0e30cb65072b8884b766f13f 38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 26 19:27:19 compute-0 python3[181440]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Jan 26 19:27:19 compute-0 sudo[181438]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:19 compute-0 sudo[181664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmzkqulijepxzxonopbtckdvmvjzoqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455639.6227932-2270-177819416839924/AnsiballZ_stat.py'
Jan 26 19:27:19 compute-0 sudo[181664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:20 compute-0 python3.9[181666]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:20 compute-0 sudo[181664]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:20 compute-0 sudo[181818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyleqzdmmbrjtyspmsnqrdmkciftqcoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455640.5170856-2288-278399928109376/AnsiballZ_file.py'
Jan 26 19:27:20 compute-0 sudo[181818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:21 compute-0 python3.9[181820]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:21 compute-0 sudo[181818]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:21 compute-0 sudo[181969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qveaeqlgkbgelqqingnhlqkdfzylkzwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455641.191589-2288-161729672431523/AnsiballZ_copy.py'
Jan 26 19:27:21 compute-0 sudo[181969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:21 compute-0 python3.9[181971]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769455641.191589-2288-161729672431523/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:21 compute-0 sudo[181969]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:22 compute-0 sudo[182045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzmhkzhnqyyijlrgjnnefazldptqumpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455641.191589-2288-161729672431523/AnsiballZ_systemd.py'
Jan 26 19:27:22 compute-0 sudo[182045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:22 compute-0 python3.9[182047]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:27:22 compute-0 systemd[1]: Reloading.
Jan 26 19:27:22 compute-0 systemd-rc-local-generator[182069]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:27:22 compute-0 systemd-sysv-generator[182075]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:27:22 compute-0 sudo[182045]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:23 compute-0 sudo[182156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmanyvmacwyrtrgikxrnnccatmsmevyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455641.191589-2288-161729672431523/AnsiballZ_systemd.py'
Jan 26 19:27:23 compute-0 sudo[182156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:23 compute-0 python3.9[182158]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:27:23 compute-0 systemd[1]: Reloading.
Jan 26 19:27:23 compute-0 systemd-sysv-generator[182186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:27:23 compute-0 systemd-rc-local-generator[182182]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:27:23 compute-0 systemd[1]: Starting nova_compute container...
Jan 26 19:27:23 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:23 compute-0 podman[182198]: 2026-01-26 19:27:23.954594674 +0000 UTC m=+0.117404434 container init a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 19:27:23 compute-0 podman[182198]: 2026-01-26 19:27:23.971047542 +0000 UTC m=+0.133857312 container start a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:27:23 compute-0 podman[182198]: nova_compute
Jan 26 19:27:23 compute-0 nova_compute[182213]: + sudo -E kolla_set_configs
Jan 26 19:27:23 compute-0 systemd[1]: Started nova_compute container.
Jan 26 19:27:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:27:24.003 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:27:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:27:24.004 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:27:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:27:24.004 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:27:24 compute-0 sudo[182156]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Validating config file
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying service configuration files
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Deleting /etc/ceph
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Creating directory /etc/ceph
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Writing out command to execute
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:24 compute-0 nova_compute[182213]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 19:27:24 compute-0 nova_compute[182213]: ++ cat /run_command
Jan 26 19:27:24 compute-0 nova_compute[182213]: + CMD=nova-compute
Jan 26 19:27:24 compute-0 nova_compute[182213]: + ARGS=
Jan 26 19:27:24 compute-0 nova_compute[182213]: + sudo kolla_copy_cacerts
Jan 26 19:27:24 compute-0 nova_compute[182213]: + [[ ! -n '' ]]
Jan 26 19:27:24 compute-0 nova_compute[182213]: + . kolla_extend_start
Jan 26 19:27:24 compute-0 nova_compute[182213]: Running command: 'nova-compute'
Jan 26 19:27:24 compute-0 nova_compute[182213]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 19:27:24 compute-0 nova_compute[182213]: + umask 0022
Jan 26 19:27:24 compute-0 nova_compute[182213]: + exec nova-compute
Jan 26 19:27:25 compute-0 python3.9[182375]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:25 compute-0 python3.9[182525]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.168 182217 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.169 182217 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.169 182217 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.169 182217 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.371 182217 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:27:26 compute-0 podman[182552]: 2026-01-26 19:27:26.382490047 +0000 UTC m=+0.126807015 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.390 182217 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.390 182217 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.430 182217 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 26 19:27:26 compute-0 nova_compute[182213]: 2026-01-26 19:27:26.432 182217 WARNING oslo_config.cfg [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 26 19:27:26 compute-0 python3.9[182704]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:27:27 compute-0 nova_compute[182213]: 2026-01-26 19:27:27.569 182217 INFO nova.virt.driver [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 19:27:27 compute-0 nova_compute[182213]: 2026-01-26 19:27:27.669 182217 INFO nova.compute.provider_config [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.175 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.175 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.176 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.176 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.176 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.176 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.176 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.177 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.178 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.179 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.180 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.181 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.182 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.183 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.184 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.185 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.186 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.187 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.188 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.189 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.190 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.191 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.192 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.193 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.194 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.195 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.196 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.197 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.198 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.199 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.200 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.201 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.202 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.203 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.204 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.205 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.206 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.207 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.208 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.209 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.210 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.211 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.212 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.213 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.214 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.215 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.216 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.217 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.218 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.219 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.220 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.221 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.222 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.223 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.224 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.225 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.226 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.227 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.228 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.229 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.230 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.231 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.232 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.233 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.234 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.235 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.236 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.237 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.238 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.239 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.240 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 WARNING oslo_config.cfg [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 19:27:28 compute-0 nova_compute[182213]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 19:27:28 compute-0 nova_compute[182213]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 19:27:28 compute-0 nova_compute[182213]: and ``live_migration_inbound_addr`` respectively.
Jan 26 19:27:28 compute-0 nova_compute[182213]: ).  Its value may be silently ignored in the future.
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.241 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 sudo[182856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bawdtrdwreayuewmpwiihkciukqhasjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455647.274878-2408-38954100661321/AnsiballZ_podman_container.py'
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.242 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.243 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.244 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.245 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.246 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 sudo[182856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.247 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.248 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.249 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.250 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.251 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.252 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.253 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.254 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.255 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.256 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.257 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.258 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.259 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.260 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.261 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.262 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.263 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.264 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.265 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.266 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.267 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.268 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.269 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.270 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.271 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.272 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.273 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.274 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.274 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.274 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.274 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.274 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.275 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.276 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.277 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.278 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.279 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.280 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.281 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.282 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.283 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.284 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.285 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.286 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.287 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.288 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.289 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.290 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.291 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.292 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.293 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.294 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.295 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.295 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.295 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.295 182217 DEBUG oslo_service.backend._eventlet.service [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.296 182217 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 26 19:27:28 compute-0 python3.9[182858]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 19:27:28 compute-0 sudo[182856]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:28 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:27:28 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.804 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 26 19:27:28 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 26 19:27:28 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.910 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5b8938b410> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 26 19:27:28 compute-0 nova_compute[182213]: libvirt:  error : internal error: could not initialize domain event timer
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.912 182217 WARNING nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.912 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5b8938b410> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.914 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.915 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.915 182217 INFO nova.utils [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] The default thread pool MainProcess.default is initialized
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.916 182217 DEBUG nova.virt.libvirt.host [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 26 19:27:28 compute-0 nova_compute[182213]: 2026-01-26 19:27:28.916 182217 INFO nova.virt.libvirt.driver [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Connection event '1' reason 'None'
Jan 26 19:27:29 compute-0 nova_compute[182213]: 2026-01-26 19:27:29.433 182217 WARNING nova.virt.libvirt.driver [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 26 19:27:29 compute-0 nova_compute[182213]: 2026-01-26 19:27:29.434 182217 DEBUG nova.virt.libvirt.volume.mount [None req-c2efb2b0-41be-4fb3-aa0a-095575633fed - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 19:27:29 compute-0 sudo[183086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pplyzapamywdcnmmplzfwtcrzhqshlao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455649.0722537-2424-196493797338316/AnsiballZ_systemd.py'
Jan 26 19:27:29 compute-0 sudo[183086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:29 compute-0 python3.9[183088]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:27:29 compute-0 systemd[1]: Stopping nova_compute container...
Jan 26 19:27:29 compute-0 nova_compute[182213]: 2026-01-26 19:27:29.888 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:27:29 compute-0 nova_compute[182213]: 2026-01-26 19:27:29.889 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:27:29 compute-0 nova_compute[182213]: 2026-01-26 19:27:29.889 182217 DEBUG oslo_concurrency.lockutils [None req-abe05272-a83c-4faa-9243-d9a7bd722181 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:27:30 compute-0 virtqemud[182929]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 26 19:27:30 compute-0 virtqemud[182929]: hostname: compute-0
Jan 26 19:27:30 compute-0 virtqemud[182929]: End of file while reading data: Input/output error
Jan 26 19:27:30 compute-0 systemd[1]: libpod-a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52.scope: Deactivated successfully.
Jan 26 19:27:30 compute-0 systemd[1]: libpod-a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52.scope: Consumed 3.288s CPU time.
Jan 26 19:27:30 compute-0 podman[183100]: 2026-01-26 19:27:30.575452845 +0000 UTC m=+0.747756843 container died a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=edpm, org.label-schema.build-date=20260120, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:27:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52-userdata-shm.mount: Deactivated successfully.
Jan 26 19:27:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5-merged.mount: Deactivated successfully.
Jan 26 19:27:30 compute-0 podman[183100]: 2026-01-26 19:27:30.662305885 +0000 UTC m=+0.834609813 container cleanup a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 19:27:30 compute-0 podman[183100]: nova_compute
Jan 26 19:27:30 compute-0 podman[183116]: 2026-01-26 19:27:30.698921976 +0000 UTC m=+0.089786141 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 19:27:30 compute-0 podman[183146]: nova_compute
Jan 26 19:27:30 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 26 19:27:30 compute-0 systemd[1]: Stopped nova_compute container.
Jan 26 19:27:30 compute-0 systemd[1]: Starting nova_compute container...
Jan 26 19:27:30 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e101dee1ea6f60bfd3c28041328987a4d14d2341a735542f72277b5642ef6ed5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:30 compute-0 podman[183161]: 2026-01-26 19:27:30.869655495 +0000 UTC m=+0.103552063 container init a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 19:27:30 compute-0 podman[183161]: 2026-01-26 19:27:30.877704112 +0000 UTC m=+0.111600640 container start a80736d754488063f9d59348212f5ce64b0e96dd00e26a6328b7decaab624f52 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:27:30 compute-0 podman[183161]: nova_compute
Jan 26 19:27:30 compute-0 nova_compute[183177]: + sudo -E kolla_set_configs
Jan 26 19:27:30 compute-0 systemd[1]: Started nova_compute container.
Jan 26 19:27:30 compute-0 sudo[183086]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Validating config file
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying service configuration files
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /etc/ceph
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Creating directory /etc/ceph
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /etc/ceph
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Writing out command to execute
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:30 compute-0 nova_compute[183177]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 26 19:27:30 compute-0 nova_compute[183177]: ++ cat /run_command
Jan 26 19:27:30 compute-0 nova_compute[183177]: + CMD=nova-compute
Jan 26 19:27:30 compute-0 nova_compute[183177]: + ARGS=
Jan 26 19:27:30 compute-0 nova_compute[183177]: + sudo kolla_copy_cacerts
Jan 26 19:27:31 compute-0 nova_compute[183177]: + [[ ! -n '' ]]
Jan 26 19:27:31 compute-0 nova_compute[183177]: + . kolla_extend_start
Jan 26 19:27:31 compute-0 nova_compute[183177]: + echo 'Running command: '\''nova-compute'\'''
Jan 26 19:27:31 compute-0 nova_compute[183177]: Running command: 'nova-compute'
Jan 26 19:27:31 compute-0 nova_compute[183177]: + umask 0022
Jan 26 19:27:31 compute-0 nova_compute[183177]: + exec nova-compute
Jan 26 19:27:31 compute-0 sudo[183338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cympnqxagzhskakflfizsqvqzuutfmtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455651.1479814-2442-230408117975939/AnsiballZ_podman_container.py'
Jan 26 19:27:31 compute-0 sudo[183338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:31 compute-0 python3.9[183340]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 26 19:27:32 compute-0 systemd[1]: Started libpod-conmon-a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0.scope.
Jan 26 19:27:32 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:27:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d12d2947541d8938b16a96f6d85fa0c906d1292ef609298286ae7259c9e1a0c9/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d12d2947541d8938b16a96f6d85fa0c906d1292ef609298286ae7259c9e1a0c9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d12d2947541d8938b16a96f6d85fa0c906d1292ef609298286ae7259c9e1a0c9/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 26 19:27:32 compute-0 podman[183361]: 2026-01-26 19:27:32.099956683 +0000 UTC m=+0.171455180 container init a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:27:32 compute-0 podman[183361]: 2026-01-26 19:27:32.110953461 +0000 UTC m=+0.182451908 container start a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:27:32 compute-0 python3.9[183340]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Applying nova statedir ownership
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 26 19:27:32 compute-0 nova_compute_init[183383]: INFO:nova_statedir:Nova statedir ownership complete
Jan 26 19:27:32 compute-0 systemd[1]: libpod-a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0.scope: Deactivated successfully.
Jan 26 19:27:32 compute-0 podman[183398]: 2026-01-26 19:27:32.216576889 +0000 UTC m=+0.033301312 container died a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 19:27:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0-userdata-shm.mount: Deactivated successfully.
Jan 26 19:27:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-d12d2947541d8938b16a96f6d85fa0c906d1292ef609298286ae7259c9e1a0c9-merged.mount: Deactivated successfully.
Jan 26 19:27:32 compute-0 podman[183398]: 2026-01-26 19:27:32.283012346 +0000 UTC m=+0.099736779 container cleanup a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0 (image=38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.223:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 26 19:27:32 compute-0 sudo[183338]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:32 compute-0 systemd[1]: libpod-conmon-a07fba89c8e06a7f53a1f08574757a8a03ba3fe6b7a2888f7d378fe7c6874bf0.scope: Deactivated successfully.
Jan 26 19:27:32 compute-0 sshd-session[160061]: Connection closed by 192.168.122.30 port 35026
Jan 26 19:27:32 compute-0 sshd-session[160058]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:27:32 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 26 19:27:32 compute-0 systemd[1]: session-24.scope: Consumed 1min 55.287s CPU time.
Jan 26 19:27:32 compute-0 systemd-logind[794]: Session 24 logged out. Waiting for processes to exit.
Jan 26 19:27:32 compute-0 systemd-logind[794]: Removed session 24.
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.002 183181 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.002 183181 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.002 183181 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.003 183181 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.111 183181 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.121 183181 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.122 183181 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.152 183181 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 26 19:27:33 compute-0 nova_compute[183177]: 2026-01-26 19:27:33.153 183181 WARNING oslo_config.cfg [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 26 19:27:34 compute-0 nova_compute[183177]: 2026-01-26 19:27:34.466 183181 INFO nova.virt.driver [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 26 19:27:34 compute-0 nova_compute[183177]: 2026-01-26 19:27:34.564 183181 INFO nova.compute.provider_config [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.153 183181 DEBUG oslo_concurrency.lockutils [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.153 183181 DEBUG oslo_concurrency.lockutils [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.154 183181 DEBUG oslo_concurrency.lockutils [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.154 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.154 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.154 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.155 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.155 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.155 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.155 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.155 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.156 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.157 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.158 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.159 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.160 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.161 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.162 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.163 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.164 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.165 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.166 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.167 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.168 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.169 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.170 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.171 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.172 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.173 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.174 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.175 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.176 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.177 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.178 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.179 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.180 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.181 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.182 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.183 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.184 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.185 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.186 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.187 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.188 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.189 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.190 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.191 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.192 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.193 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.194 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.195 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.196 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.197 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.197 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.197 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.197 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.197 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.198 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.198 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.198 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.198 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.198 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.199 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.200 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.201 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.202 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.203 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.204 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.205 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.206 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.206 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.206 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.206 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.206 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.207 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.208 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.209 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.210 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.210 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.210 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.210 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.210 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.211 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.212 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.213 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.214 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.214 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.214 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.214 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.214 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.215 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.216 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.217 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.218 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.219 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.220 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.221 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.222 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.223 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.224 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.225 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.226 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.227 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.228 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.229 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.230 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.231 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.232 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.233 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.234 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.235 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.236 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.236 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.236 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.236 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.236 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 WARNING oslo_config.cfg [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 26 19:27:35 compute-0 nova_compute[183177]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 26 19:27:35 compute-0 nova_compute[183177]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 26 19:27:35 compute-0 nova_compute[183177]: and ``live_migration_inbound_addr`` respectively.
Jan 26 19:27:35 compute-0 nova_compute[183177]: ).  Its value may be silently ignored in the future.
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.237 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.238 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.238 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.238 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.238 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.238 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.239 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.240 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.241 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.242 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.242 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.242 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.242 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.242 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.243 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.244 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.245 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.246 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.246 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.246 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.246 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.246 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.247 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.248 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.249 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.250 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.251 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.252 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.253 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.253 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.253 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.253 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.253 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.254 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.255 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.256 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.257 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.258 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.259 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.260 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.261 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.262 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.263 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.264 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.265 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.265 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.265 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.265 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.265 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.266 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.267 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.268 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.269 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.270 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.271 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.272 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.273 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.274 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.275 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.276 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.277 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.278 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.279 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.279 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.279 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.279 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.279 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.280 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.281 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.282 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.283 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.284 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.285 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.286 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.287 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.288 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.289 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.290 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.291 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.292 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.293 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.294 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.295 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.296 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.297 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.298 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.299 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.300 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.301 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.302 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.303 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.304 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.304 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.304 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.304 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.304 183181 DEBUG oslo_service.backend._eventlet.service [None req-6af33377-c5f3-49e0-a610-f3d13f1912b6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.305 183181 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.813 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.827 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7cc5ed7380> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 26 19:27:35 compute-0 nova_compute[183177]: libvirt:  error : internal error: could not initialize domain event timer
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.828 183181 WARNING nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.828 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7cc5ed7380> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.830 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.830 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.831 183181 INFO nova.utils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] The default thread pool MainProcess.default is initialized
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.831 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.831 183181 INFO nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Connection event '1' reason 'None'
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.842 183181 INFO nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt host capabilities <capabilities>
Jan 26 19:27:35 compute-0 nova_compute[183177]: 
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <host>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <uuid>a7a0dc8c-0440-40bb-835e-0c8b31a79067</uuid>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <cpu>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <arch>x86_64</arch>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model>EPYC-Rome-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <vendor>AMD</vendor>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <microcode version='16777317'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <signature family='23' model='49' stepping='0'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='x2apic'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='tsc-deadline'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='osxsave'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='hypervisor'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='tsc_adjust'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='spec-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='stibp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='arch-capabilities'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='cmp_legacy'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='topoext'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='virt-ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='lbrv'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='tsc-scale'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='vmcb-clean'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='pause-filter'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='pfthreshold'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='svme-addr-chk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='rdctl-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='skip-l1dfl-vmentry'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='mds-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature name='pschange-mc-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <pages unit='KiB' size='4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <pages unit='KiB' size='2048'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <pages unit='KiB' size='1048576'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </cpu>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <power_management>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <suspend_mem/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <suspend_disk/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <suspend_hybrid/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </power_management>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <iommu support='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <migration_features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <live/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <uri_transports>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <uri_transport>tcp</uri_transport>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <uri_transport>rdma</uri_transport>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </uri_transports>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </migration_features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <topology>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <cells num='1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <cell id='0'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <memory unit='KiB'>7864316</memory>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <pages unit='KiB' size='2048'>0</pages>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <distances>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <sibling id='0' value='10'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           </distances>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           <cpus num='8'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:           </cpus>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         </cell>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </cells>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </topology>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <cache>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </cache>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <secmodel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model>selinux</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <doi>0</doi>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </secmodel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <secmodel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model>dac</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <doi>0</doi>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </secmodel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </host>
Jan 26 19:27:35 compute-0 nova_compute[183177]: 
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <guest>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <os_type>hvm</os_type>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <arch name='i686'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <wordsize>32</wordsize>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <domain type='qemu'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <domain type='kvm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </arch>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <pae/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <nonpae/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <acpi default='on' toggle='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <apic default='on' toggle='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <cpuselection/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <deviceboot/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <disksnapshot default='on' toggle='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <externalSnapshot/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </guest>
Jan 26 19:27:35 compute-0 nova_compute[183177]: 
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <guest>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <os_type>hvm</os_type>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <arch name='x86_64'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <wordsize>64</wordsize>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <domain type='qemu'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <domain type='kvm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </arch>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <acpi default='on' toggle='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <apic default='on' toggle='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <cpuselection/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <deviceboot/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <disksnapshot default='on' toggle='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <externalSnapshot/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </guest>
Jan 26 19:27:35 compute-0 nova_compute[183177]: 
Jan 26 19:27:35 compute-0 nova_compute[183177]: </capabilities>
Jan 26 19:27:35 compute-0 nova_compute[183177]: 
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.851 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.879 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 26 19:27:35 compute-0 nova_compute[183177]: <domainCapabilities>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <domain>kvm</domain>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <arch>i686</arch>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <vcpu max='240'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <iothreads supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <os supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <enum name='firmware'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <loader supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>rom</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pflash</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='readonly'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>yes</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='secure'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </loader>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </os>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <cpu>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='host-passthrough' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='hostPassthroughMigratable'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='maximum' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='maximumMigratable'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='host-model' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <vendor>AMD</vendor>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='x2apic'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='hypervisor'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='stibp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='overflow-recov'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='succor'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='lbrv'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-scale'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='flushbyasid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='pause-filter'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='pfthreshold'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='disable' name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='custom' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Dhyana-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v6'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v7'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='IvyBridge'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='KnightsMill'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='KnightsMill-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SierraForest'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Snowridge'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='athlon'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='athlon-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='core2duo'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='core2duo-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='coreduo'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='coreduo-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='n270'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='n270-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='phenom'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='phenom-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <memoryBacking supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <enum name='sourceType'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <value>file</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <value>anonymous</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <value>memfd</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </memoryBacking>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <disk supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='diskDevice'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>disk</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>cdrom</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>floppy</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>lun</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>ide</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>fdc</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>sata</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <graphics supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vnc</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>egl-headless</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <video supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='modelType'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vga</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>cirrus</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>none</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>bochs</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>ramfb</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </video>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <hostdev supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='mode'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>subsystem</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='startupPolicy'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>mandatory</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>requisite</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>optional</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='subsysType'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pci</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='capsType'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='pciBackend'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </hostdev>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <rng supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>random</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>egd</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <filesystem supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='driverType'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>path</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>handle</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>virtiofs</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </filesystem>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <tpm supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>tpm-tis</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>tpm-crb</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>emulator</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>external</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='backendVersion'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>2.0</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </tpm>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <redirdev supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </redirdev>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <channel supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </channel>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <crypto supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='model'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>qemu</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </crypto>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <interface supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='backendType'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>passt</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <panic supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>isa</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>hyperv</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </panic>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <console supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>null</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vc</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>dev</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>file</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pipe</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>stdio</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>udp</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>tcp</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>qemu-vdagent</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </console>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <features>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <gic supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <vmcoreinfo supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <genid supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <backingStoreInput supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <backup supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <async-teardown supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <s390-pv supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <ps2 supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <tdx supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <sev supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <sgx supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <hyperv supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='features'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>relaxed</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vapic</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>spinlocks</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vpindex</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>runtime</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>synic</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>stimer</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>reset</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>vendor_id</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>frequencies</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>reenlightenment</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>tlbflush</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>ipi</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>avic</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>emsr_bitmap</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>xmm_input</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <defaults>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <spinlocks>4095</spinlocks>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <stimer_direct>on</stimer_direct>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </defaults>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </hyperv>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <launchSecurity supported='no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </features>
Jan 26 19:27:35 compute-0 nova_compute[183177]: </domainCapabilities>
Jan 26 19:27:35 compute-0 nova_compute[183177]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 19:27:35 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.891 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 26 19:27:35 compute-0 nova_compute[183177]: <domainCapabilities>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <domain>kvm</domain>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <arch>i686</arch>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <vcpu max='4096'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <iothreads supported='yes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <os supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <enum name='firmware'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <loader supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>rom</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>pflash</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='readonly'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>yes</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='secure'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </loader>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   </os>
Jan 26 19:27:35 compute-0 nova_compute[183177]:   <cpu>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='host-passthrough' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='hostPassthroughMigratable'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='maximum' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <enum name='maximumMigratable'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='host-model' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <vendor>AMD</vendor>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='x2apic'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='hypervisor'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='stibp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='overflow-recov'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='succor'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='lbrv'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-scale'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='flushbyasid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='pause-filter'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='pfthreshold'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <feature policy='disable' name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:35 compute-0 nova_compute[183177]:     <mode name='custom' supported='yes'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Denverton-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Dhyana-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v4'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='EPYC-v5'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v3'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v1'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v2'>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 19:27:35 compute-0 nova_compute[183177]:       <blockers model='Haswell-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v6'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v7'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <memoryBacking supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <enum name='sourceType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>anonymous</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>memfd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </memoryBacking>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <disk supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='diskDevice'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>disk</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cdrom</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>floppy</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>lun</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>fdc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>sata</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <graphics supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vnc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egl-headless</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <video supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='modelType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vga</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cirrus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>none</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>bochs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ramfb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </video>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hostdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='mode'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>subsystem</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='startupPolicy'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>mandatory</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>requisite</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>optional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='subsysType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pci</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='capsType'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='pciBackend'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hostdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <rng supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>random</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <filesystem supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='driverType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>path</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>handle</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtiofs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </filesystem>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tpm supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-tis</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-crb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emulator</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>external</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendVersion'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>2.0</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </tpm>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <redirdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </redirdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <channel supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </channel>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <crypto supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </crypto>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <interface supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>passt</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <panic supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>isa</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>hyperv</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </panic>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <console supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>null</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dev</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pipe</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stdio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>udp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tcp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu-vdagent</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </console>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <features>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <gic supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <vmcoreinfo supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <genid supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backingStoreInput supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backup supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <async-teardown supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <s390-pv supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <ps2 supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tdx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sev supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sgx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hyperv supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='features'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>relaxed</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vapic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>spinlocks</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vpindex</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>runtime</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>synic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stimer</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reset</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vendor_id</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>frequencies</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reenlightenment</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tlbflush</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ipi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>avic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emsr_bitmap</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>xmm_input</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <spinlocks>4095</spinlocks>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <stimer_direct>on</stimer_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hyperv>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <launchSecurity supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </features>
Jan 26 19:27:36 compute-0 nova_compute[183177]: </domainCapabilities>
Jan 26 19:27:36 compute-0 nova_compute[183177]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.968 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:35.975 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 26 19:27:36 compute-0 nova_compute[183177]: <domainCapabilities>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <domain>kvm</domain>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <arch>x86_64</arch>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <vcpu max='240'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <iothreads supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <os supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <enum name='firmware'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <loader supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>rom</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pflash</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='readonly'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>yes</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='secure'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </loader>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </os>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='host-passthrough' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='hostPassthroughMigratable'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='maximum' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='maximumMigratable'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='host-model' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <vendor>AMD</vendor>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='x2apic'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='hypervisor'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='stibp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='overflow-recov'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='succor'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='lbrv'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-scale'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='flushbyasid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='pause-filter'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='pfthreshold'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='disable' name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='custom' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Dhyana-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v6'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v7'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <memoryBacking supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <enum name='sourceType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>anonymous</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>memfd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </memoryBacking>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <disk supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='diskDevice'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>disk</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cdrom</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>floppy</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>lun</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ide</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>fdc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>sata</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <graphics supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vnc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egl-headless</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <video supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='modelType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vga</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cirrus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>none</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>bochs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ramfb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </video>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hostdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='mode'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>subsystem</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='startupPolicy'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>mandatory</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>requisite</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>optional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='subsysType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pci</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='capsType'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='pciBackend'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hostdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <rng supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>random</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <filesystem supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='driverType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>path</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>handle</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtiofs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </filesystem>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tpm supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-tis</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-crb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emulator</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>external</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendVersion'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>2.0</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </tpm>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <redirdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </redirdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <channel supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </channel>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <crypto supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </crypto>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <interface supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>passt</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <panic supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>isa</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>hyperv</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </panic>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <console supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>null</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dev</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pipe</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stdio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>udp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tcp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu-vdagent</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </console>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <features>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <gic supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <vmcoreinfo supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <genid supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backingStoreInput supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backup supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <async-teardown supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <s390-pv supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <ps2 supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tdx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sev supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sgx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hyperv supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='features'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>relaxed</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vapic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>spinlocks</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vpindex</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>runtime</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>synic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stimer</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reset</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vendor_id</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>frequencies</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reenlightenment</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tlbflush</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ipi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>avic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emsr_bitmap</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>xmm_input</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <spinlocks>4095</spinlocks>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <stimer_direct>on</stimer_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hyperv>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <launchSecurity supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </features>
Jan 26 19:27:36 compute-0 nova_compute[183177]: </domainCapabilities>
Jan 26 19:27:36 compute-0 nova_compute[183177]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.047 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 26 19:27:36 compute-0 nova_compute[183177]: <domainCapabilities>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <path>/usr/libexec/qemu-kvm</path>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <domain>kvm</domain>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <arch>x86_64</arch>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <vcpu max='4096'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <iothreads supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <os supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <enum name='firmware'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>efi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <loader supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>rom</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pflash</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='readonly'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>yes</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='secure'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>yes</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>no</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </loader>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </os>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='host-passthrough' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='hostPassthroughMigratable'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='maximum' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='maximumMigratable'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>on</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>off</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='host-model' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <vendor>AMD</vendor>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='x2apic'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-deadline'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='hypervisor'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc_adjust'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='spec-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='stibp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='cmp_legacy'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='overflow-recov'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='succor'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='amd-ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='virt-ssbd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='lbrv'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='tsc-scale'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='vmcb-clean'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='flushbyasid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='pause-filter'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='pfthreshold'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='svme-addr-chk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <feature policy='disable' name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <mode name='custom' supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Broadwell-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cascadelake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='ClearwaterForest-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ddpd-u'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sha512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm3'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sm4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Cooperlake-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Denverton-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Dhyana-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Genoa-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Milan-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Rome-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-Turin-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amd-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='auto-ibrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vp2intersect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fs-gs-base-ns'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibpb-brtype'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='no-nested-data-bp'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='null-sel-clr-base'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='perfmon-v2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbpb'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='srso-user-kernel-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='stibp-always-on'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='EPYC-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='GraniteRapids-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-128'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-256'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx10-512'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='prefetchiti'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Haswell-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-noTSX'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v6'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Icelake-Server-v7'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='IvyBridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='KnightsMill-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4fmaps'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-4vnniw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512er'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512pf'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G4-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Opteron_G5-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fma4'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tbm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xop'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SapphireRapids-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='amx-tile'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-bf16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-fp16'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512-vpopcntdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bitalg'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vbmi2'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrc'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fzrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='la57'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='taa-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='tsx-ldtrk'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='SierraForest-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ifma'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-ne-convert'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx-vnni-int8'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bhi-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='bus-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cmpccxadd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fbsdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='fsrs'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ibrs-all'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='intel-psfd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ipred-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='lam'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mcdt-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pbrsb-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='psdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rrsba-ctrl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='sbdr-ssdp-no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='serialize'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vaes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='vpclmulqdq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Client-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='hle'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='rtm'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Skylake-Server-v5'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512bw'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512cd'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512dq'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512f'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='avx512vl'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='invpcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pcid'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='pku'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='mpx'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v2'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v3'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='core-capability'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='split-lock-detect'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='Snowridge-v4'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='cldemote'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='erms'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='gfni'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdir64b'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='movdiri'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='xsaves'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='athlon-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='core2duo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='coreduo-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='n270-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='ss'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <blockers model='phenom-v1'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnow'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <feature name='3dnowext'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </blockers>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </mode>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <memoryBacking supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <enum name='sourceType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>anonymous</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <value>memfd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </memoryBacking>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <disk supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='diskDevice'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>disk</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cdrom</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>floppy</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>lun</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>fdc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>sata</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <graphics supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vnc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egl-headless</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <video supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='modelType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vga</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>cirrus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>none</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>bochs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ramfb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </video>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hostdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='mode'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>subsystem</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='startupPolicy'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>mandatory</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>requisite</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>optional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='subsysType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pci</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>scsi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='capsType'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='pciBackend'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hostdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <rng supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtio-non-transitional</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>random</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>egd</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <filesystem supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='driverType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>path</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>handle</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>virtiofs</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </filesystem>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tpm supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-tis</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tpm-crb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emulator</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>external</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendVersion'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>2.0</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </tpm>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <redirdev supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='bus'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>usb</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </redirdev>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <channel supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </channel>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <crypto supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendModel'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>builtin</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </crypto>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <interface supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='backendType'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>default</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>passt</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <panic supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='model'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>isa</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>hyperv</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </panic>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <console supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='type'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>null</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vc</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pty</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dev</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>file</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>pipe</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stdio</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>udp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tcp</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>unix</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>qemu-vdagent</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>dbus</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </console>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <features>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <gic supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <vmcoreinfo supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <genid supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backingStoreInput supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <backup supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <async-teardown supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <s390-pv supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <ps2 supported='yes'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <tdx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sev supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <sgx supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <hyperv supported='yes'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <enum name='features'>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>relaxed</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vapic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>spinlocks</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vpindex</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>runtime</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>synic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>stimer</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reset</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>vendor_id</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>frequencies</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>reenlightenment</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>tlbflush</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>ipi</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>avic</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>emsr_bitmap</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <value>xmm_input</value>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </enum>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       <defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <spinlocks>4095</spinlocks>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <stimer_direct>on</stimer_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_direct>on</tlbflush_direct>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <tlbflush_extended>on</tlbflush_extended>
Jan 26 19:27:36 compute-0 nova_compute[183177]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 26 19:27:36 compute-0 nova_compute[183177]:       </defaults>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     </hyperv>
Jan 26 19:27:36 compute-0 nova_compute[183177]:     <launchSecurity supported='no'/>
Jan 26 19:27:36 compute-0 nova_compute[183177]:   </features>
Jan 26 19:27:36 compute-0 nova_compute[183177]: </domainCapabilities>
Jan 26 19:27:36 compute-0 nova_compute[183177]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
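The domainCapabilities dump above pairs each `<model usable='no'>` entry with a `<blockers model='...'>` element listing the CPU features the host lacks. A minimal sketch of extracting that mapping with Python's standard-library XML parser; the embedded snippet is a reduced excerpt reconstructed from the log, and the element/attribute names match what libvirt emits here:

```python
# Sketch: map each usable='no' CPU model to the feature names blocking it,
# following the <model>/<blockers> structure in the domainCapabilities dump.
import xml.etree.ElementTree as ET

# Reduced excerpt of the <mode> section from the log above (illustrative only).
SNIPPET = """
<mode>
  <model usable='no' vendor='Intel'>Snowridge-v4</model>
  <blockers model='Snowridge-v4'>
    <feature name='cldemote'/>
    <feature name='erms'/>
  </blockers>
  <model usable='yes' vendor='Intel'>Westmere-v1</model>
</mode>
"""

def unusable_models(xml_text):
    """Return {model name: [blocking feature names]} for usable='no' models."""
    root = ET.fromstring(xml_text)
    blocked = {m.text: [] for m in root.findall("model") if m.get("usable") == "no"}
    for blockers in root.findall("blockers"):
        name = blockers.get("model")
        if name in blocked:
            blocked[name] = [f.get("name") for f in blockers.findall("feature")]
    return blocked

print(unusable_models(SNIPPET))
# {'Snowridge-v4': ['cldemote', 'erms']}
```

Usable models (like Westmere-v1 in the excerpt) have no `<blockers>` sibling and are simply skipped.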
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.240 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.249 183181 INFO nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Secure Boot support detected
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.255 183181 INFO nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.341 183181 WARNING nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.343 183181 DEBUG nova.virt.libvirt.volume.mount [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.420 183181 DEBUG nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] cpu compare xml: <cpu match="exact">
Jan 26 19:27:36 compute-0 nova_compute[183177]:   <model>Nehalem</model>
Jan 26 19:27:36 compute-0 nova_compute[183177]: </cpu>
Jan 26 19:27:36 compute-0 nova_compute[183177]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.422 183181 DEBUG nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Jan 26 19:27:36 compute-0 nova_compute[183177]: 2026-01-26 19:27:36.934 183181 INFO nova.virt.node [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Determined node identity a47e311f-639f-4d60-b79d-85bbf53e2f35 from /var/lib/nova/compute_id
Jan 26 19:27:37 compute-0 nova_compute[183177]: 2026-01-26 19:27:37.448 183181 WARNING nova.compute.manager [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Compute nodes ['a47e311f-639f-4d60-b79d-85bbf53e2f35'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 26 19:27:38 compute-0 nova_compute[183177]: 2026-01-26 19:27:38.470 183181 INFO nova.compute.manager [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 26 19:27:38 compute-0 sshd-session[183500]: Accepted publickey for zuul from 192.168.122.30 port 35746 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 19:27:38 compute-0 systemd-logind[794]: New session 26 of user zuul.
Jan 26 19:27:38 compute-0 systemd[1]: Started Session 26 of User zuul.
Jan 26 19:27:38 compute-0 sshd-session[183500]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.490 183181 WARNING nova.compute.manager [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.490 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.490 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.490 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.491 183181 DEBUG nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.656 183181 WARNING nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.657 183181 DEBUG oslo_concurrency.processutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:27:39 compute-0 python3.9[183653]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.714 183181 DEBUG oslo_concurrency.processutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.716 183181 DEBUG nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6205MB free_disk=73.3042106628418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.716 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:27:39 compute-0 nova_compute[183177]: 2026-01-26 19:27:39.717 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:27:40 compute-0 nova_compute[183177]: 2026-01-26 19:27:40.673 183181 WARNING nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] No compute node record for compute-0.ctlplane.example.com:a47e311f-639f-4d60-b79d-85bbf53e2f35: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host a47e311f-639f-4d60-b79d-85bbf53e2f35 could not be found.
Jan 26 19:27:40 compute-0 sudo[183808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptygaemmpmnlqkjvdpaolyvfxizwyrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455660.3924193-47-141775181541886/AnsiballZ_systemd_service.py'
Jan 26 19:27:40 compute-0 sudo[183808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:41 compute-0 nova_compute[183177]: 2026-01-26 19:27:41.183 183181 INFO nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: a47e311f-639f-4d60-b79d-85bbf53e2f35
Jan 26 19:27:41 compute-0 python3.9[183810]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:27:41 compute-0 systemd[1]: Reloading.
Jan 26 19:27:41 compute-0 systemd-sysv-generator[183836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:27:41 compute-0 systemd-rc-local-generator[183833]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:27:41 compute-0 sudo[183808]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:42 compute-0 python3.9[183995]: ansible-ansible.builtin.service_facts Invoked
Jan 26 19:27:42 compute-0 network[184012]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 26 19:27:42 compute-0 network[184013]: 'network-scripts' will be removed from distribution in near future.
Jan 26 19:27:42 compute-0 network[184014]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 26 19:27:42 compute-0 nova_compute[183177]: 2026-01-26 19:27:42.711 183181 DEBUG nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:27:42 compute-0 nova_compute[183177]: 2026-01-26 19:27:42.713 183181 DEBUG nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:27:39 up 51 min,  0 user,  load average: 1.08, 0.90, 0.66\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.316 183181 INFO nova.scheduler.client.report [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] [req-aeabc856-a433-420f-86b6-0eb30a5ca961] Created resource provider record via placement API for resource provider with UUID a47e311f-639f-4d60-b79d-85bbf53e2f35 and name compute-0.ctlplane.example.com.
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.343 183181 DEBUG nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 26 19:27:43 compute-0 nova_compute[183177]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.344 183181 INFO nova.virt.libvirt.host [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] kernel doesn't support AMD SEV
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.344 183181 DEBUG nova.compute.provider_tree [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.345 183181 DEBUG nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.349 183181 DEBUG nova.virt.libvirt.driver [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Libvirt baseline CPU <cpu>
Jan 26 19:27:43 compute-0 nova_compute[183177]:   <arch>x86_64</arch>
Jan 26 19:27:43 compute-0 nova_compute[183177]:   <model>Nehalem</model>
Jan 26 19:27:43 compute-0 nova_compute[183177]:   <vendor>AMD</vendor>
Jan 26 19:27:43 compute-0 nova_compute[183177]:   <topology sockets="8" cores="1" threads="1"/>
Jan 26 19:27:43 compute-0 nova_compute[183177]:   <maxphysaddr mode="emulate" bits="40"/>
Jan 26 19:27:43 compute-0 nova_compute[183177]: </cpu>
Jan 26 19:27:43 compute-0 nova_compute[183177]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.906 183181 DEBUG nova.scheduler.client.report [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Updated inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.907 183181 DEBUG nova.compute.provider_tree [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:27:43 compute-0 nova_compute[183177]: 2026-01-26 19:27:43.907 183181 DEBUG nova.compute.provider_tree [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.103 183181 DEBUG nova.compute.provider_tree [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.614 183181 DEBUG nova.compute.resource_tracker [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.614 183181 DEBUG oslo_concurrency.lockutils [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.898s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.615 183181 DEBUG nova.service [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.736 183181 DEBUG nova.service [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Jan 26 19:27:44 compute-0 nova_compute[183177]: 2026-01-26 19:27:44.737 183181 DEBUG nova.servicegroup.drivers.db [None req-0f903225-cb3d-4521-aae4-fc5715b8a88d - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Jan 26 19:27:51 compute-0 sudo[184284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znepzquksuqddekcegcoljdscpgdlbnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455671.0158484-85-145762396597294/AnsiballZ_systemd_service.py'
Jan 26 19:27:51 compute-0 sudo[184284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:51 compute-0 python3.9[184286]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:27:51 compute-0 sudo[184284]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:52 compute-0 sudo[184437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjeouthtqxdzlkotyyrfranocxuuojry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455672.001313-105-13695366620699/AnsiballZ_file.py'
Jan 26 19:27:52 compute-0 sudo[184437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:52 compute-0 python3.9[184439]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:52 compute-0 sudo[184437]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:52 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:27:53 compute-0 sudo[184590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrwycobgozvtovddutjtxnpqiyckpyuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455672.9001293-121-95882792635746/AnsiballZ_file.py'
Jan 26 19:27:53 compute-0 sudo[184590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:53 compute-0 python3.9[184592]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:27:53 compute-0 sudo[184590]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:54 compute-0 sudo[184742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjepknjhzqavycllbzppndnfekuypjte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455673.7789671-139-38586000214217/AnsiballZ_command.py'
Jan 26 19:27:54 compute-0 sudo[184742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:54 compute-0 python3.9[184744]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:27:54 compute-0 sudo[184742]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:55 compute-0 python3.9[184896]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:27:56 compute-0 sudo[185046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfxoqstspbthrfnyffgxlhahpcithrxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455675.8791342-175-159944165868946/AnsiballZ_systemd_service.py'
Jan 26 19:27:56 compute-0 sudo[185046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:56 compute-0 python3.9[185048]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:27:56 compute-0 systemd[1]: Reloading.
Jan 26 19:27:56 compute-0 systemd-rc-local-generator[185093]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:27:56 compute-0 systemd-sysv-generator[185099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:27:56 compute-0 podman[185050]: 2026-01-26 19:27:56.834326794 +0000 UTC m=+0.184498233 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, tcib_managed=true)
Jan 26 19:27:56 compute-0 sudo[185046]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:57 compute-0 sudo[185260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyprjyqsykfjqmoyonxdauaxobhfxrgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455677.123107-191-87650765830951/AnsiballZ_command.py'
Jan 26 19:27:57 compute-0 sudo[185260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:57 compute-0 python3.9[185262]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:27:57 compute-0 sudo[185260]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:58 compute-0 sudo[185413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdcqxhwdpimvwfjuyovxvdbhnfpfotvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455677.9611032-209-31543854724796/AnsiballZ_file.py'
Jan 26 19:27:58 compute-0 sudo[185413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:27:58 compute-0 python3.9[185415]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:27:58 compute-0 sudo[185413]: pam_unix(sudo:session): session closed for user root
Jan 26 19:27:59 compute-0 python3.9[185565]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:00 compute-0 sudo[185717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eriqavdnliseloglkcbpfkqxkpncuwmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455679.700908-241-267876629841488/AnsiballZ_group.py'
Jan 26 19:28:00 compute-0 sudo[185717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:00 compute-0 python3.9[185719]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 26 19:28:00 compute-0 sudo[185717]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:01 compute-0 podman[185819]: 2026-01-26 19:28:01.318448299 +0000 UTC m=+0.061748742 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 19:28:01 compute-0 sudo[185888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoxtvrjixankybdsyqwerqrneuhkjtws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455680.8878195-263-247895921289616/AnsiballZ_getent.py'
Jan 26 19:28:01 compute-0 sudo[185888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:01 compute-0 python3.9[185890]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 26 19:28:01 compute-0 sudo[185888]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:02 compute-0 sudo[186041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxjsvaepyetyfejobsxzxrksnkvtisvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455681.7850718-279-104038389646847/AnsiballZ_group.py'
Jan 26 19:28:02 compute-0 sudo[186041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:02 compute-0 python3.9[186043]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 26 19:28:02 compute-0 groupadd[186044]: group added to /etc/group: name=ceilometer, GID=42405
Jan 26 19:28:02 compute-0 groupadd[186044]: group added to /etc/gshadow: name=ceilometer
Jan 26 19:28:02 compute-0 groupadd[186044]: new group: name=ceilometer, GID=42405
Jan 26 19:28:02 compute-0 sudo[186041]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:03 compute-0 sudo[186199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tugcehdrqecqrxdciyfffswzerfyjymp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455682.616648-295-75273255800120/AnsiballZ_user.py'
Jan 26 19:28:03 compute-0 sudo[186199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:03 compute-0 python3.9[186201]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 26 19:28:03 compute-0 useradd[186203]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 26 19:28:03 compute-0 useradd[186203]: add 'ceilometer' to group 'libvirt'
Jan 26 19:28:03 compute-0 useradd[186203]: add 'ceilometer' to shadow group 'libvirt'
Jan 26 19:28:03 compute-0 sudo[186199]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:05 compute-0 python3.9[186359]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:05 compute-0 python3.9[186480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769455684.5691864-347-142264595839513/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:06 compute-0 python3.9[186630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:06 compute-0 sshd-session[186631]: Invalid user vyos from 193.32.162.151 port 44264
Jan 26 19:28:06 compute-0 sshd-session[186631]: Connection closed by invalid user vyos 193.32.162.151 port 44264 [preauth]
Jan 26 19:28:07 compute-0 python3.9[186753]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769455685.9857259-347-35990062173821/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:07 compute-0 python3.9[186903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:08 compute-0 python3.9[187024]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769455687.2677288-347-135620884596333/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:09 compute-0 python3.9[187174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:09 compute-0 python3.9[187326]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:10 compute-0 python3.9[187478]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:11 compute-0 python3.9[187599]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455690.1191478-465-81801784820692/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:12 compute-0 python3.9[187749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:12 compute-0 python3.9[187870]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455691.5633216-465-131658982796184/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:13 compute-0 python3.9[188020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:14 compute-0 python3.9[188141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455692.9571207-523-280615882496979/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:14 compute-0 python3.9[188291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:15 compute-0 python3.9[188412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455694.444303-555-148503263675889/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:16 compute-0 python3.9[188562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:17 compute-0 python3.9[188683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455695.8496573-585-39331666991737/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:17 compute-0 python3.9[188833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:18 compute-0 python3.9[188954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455697.2472029-615-174921873447575/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:18 compute-0 sudo[189104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svipuirzuocuxvpzqwqydpsptuoebnki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455698.6277897-645-242104435017520/AnsiballZ_file.py'
Jan 26 19:28:18 compute-0 sudo[189104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:19 compute-0 python3.9[189106]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:19 compute-0 sudo[189104]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:19 compute-0 sudo[189256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cflbvtvwduqcbpbyzmelmjvoatmhrnij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455699.390604-661-183274332256243/AnsiballZ_file.py'
Jan 26 19:28:19 compute-0 sudo[189256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:19 compute-0 python3.9[189258]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:19 compute-0 sudo[189256]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:20 compute-0 python3.9[189408]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:21 compute-0 python3.9[189560]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:22 compute-0 python3.9[189712]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:22 compute-0 sudo[189864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edakuzodjpcdludakxbjbdcchkmctjgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455702.4836593-725-181169268169650/AnsiballZ_file.py'
Jan 26 19:28:22 compute-0 sudo[189864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:22 compute-0 python3.9[189866]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:22 compute-0 sudo[189864]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:23 compute-0 sudo[190016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jovtqcwmvphyjsumelgzfzbfiemkltys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455703.2045996-741-230926566603363/AnsiballZ_systemd_service.py'
Jan 26 19:28:23 compute-0 sudo[190016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:23 compute-0 python3.9[190018]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:28:23 compute-0 systemd[1]: Reloading.
Jan 26 19:28:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:28:24.006 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:28:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:28:24.006 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:28:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:28:24.006 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:28:24 compute-0 systemd-rc-local-generator[190049]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:28:24 compute-0 systemd-sysv-generator[190053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:28:24 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 26 19:28:24 compute-0 sudo[190016]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:24 compute-0 sudo[190209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwguytrbjcjcdbrcgtvzrxyvtdohbzpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455704.5893035-759-3677622880856/AnsiballZ_stat.py'
Jan 26 19:28:24 compute-0 sudo[190209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:25 compute-0 python3.9[190211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:25 compute-0 sudo[190209]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:25 compute-0 sudo[190332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcyvcsxzzclmvojgohcofdxfqlkzyscy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455704.5893035-759-3677622880856/AnsiballZ_copy.py'
Jan 26 19:28:25 compute-0 sudo[190332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:25 compute-0 python3.9[190334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455704.5893035-759-3677622880856/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:25 compute-0 sudo[190332]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:26 compute-0 sudo[190484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvokztvvsmqvnbtghmyiqdzwaqdzpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455706.471254-801-239084913942787/AnsiballZ_file.py'
Jan 26 19:28:26 compute-0 sudo[190484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:27 compute-0 python3.9[190486]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:27 compute-0 sudo[190484]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:27 compute-0 podman[190487]: 2026-01-26 19:28:27.247277371 +0000 UTC m=+0.151357826 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 19:28:27 compute-0 sudo[190662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txxagufycnqceadywdamivopvxtpndev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455707.292763-817-89566349271606/AnsiballZ_file.py'
Jan 26 19:28:27 compute-0 sudo[190662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:27 compute-0 python3.9[190664]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:27 compute-0 sudo[190662]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:28 compute-0 python3.9[190814]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:31 compute-0 sudo[191235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhkkawmsldjhmaopkzplnjjqqbtzshoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455710.6673055-885-14037948020280/AnsiballZ_container_config_data.py'
Jan 26 19:28:31 compute-0 sudo[191235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:31 compute-0 python3.9[191237]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 26 19:28:31 compute-0 sudo[191235]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:32 compute-0 podman[191361]: 2026-01-26 19:28:32.161230261 +0000 UTC m=+0.053810444 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 19:28:32 compute-0 sudo[191404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgqxokxybpndhmhvrxtxfhzyvrgljbyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455711.7056432-907-112758491552812/AnsiballZ_container_config_hash.py'
Jan 26 19:28:32 compute-0 sudo[191404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:32 compute-0 python3.9[191408]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:28:32 compute-0 sudo[191404]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:33 compute-0 sudo[191558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtcykymgxillwlgwkwxmgdtwinwhkiim ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455713.0714486-927-139550093480836/AnsiballZ_edpm_container_manage.py'
Jan 26 19:28:33 compute-0 sudo[191558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:33 compute-0 python3[191560]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:28:35 compute-0 podman[191573]: 2026-01-26 19:28:35.409591896 +0000 UTC m=+1.405817881 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 19:28:35 compute-0 podman[191668]: 2026-01-26 19:28:35.596796405 +0000 UTC m=+0.078402441 container create 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter)
Jan 26 19:28:35 compute-0 podman[191668]: 2026-01-26 19:28:35.558909791 +0000 UTC m=+0.040515887 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 26 19:28:35 compute-0 python3[191560]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 26 19:28:35 compute-0 sudo[191558]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:36 compute-0 sudo[191856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzkeahpopovqmlqbfgsajumwhbylnosp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455716.0452526-943-2059030270413/AnsiballZ_stat.py'
Jan 26 19:28:36 compute-0 sudo[191856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:36 compute-0 python3.9[191858]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:36 compute-0 sudo[191856]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:37 compute-0 sudo[192010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sasmebvkeewwfvxnnhqzdypwmtvdtnsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455716.9897723-961-266497540274405/AnsiballZ_file.py'
Jan 26 19:28:37 compute-0 sudo[192010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:37 compute-0 python3.9[192012]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:37 compute-0 sudo[192010]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:37 compute-0 sudo[192086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smptlvilnpknzwulpwmdvlwidrjjcwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455716.9897723-961-266497540274405/AnsiballZ_stat.py'
Jan 26 19:28:37 compute-0 sudo[192086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:38 compute-0 python3.9[192088]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:38 compute-0 sudo[192086]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:38 compute-0 sudo[192237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otgrychnmctwiyawkgzvpwqmoeqshuci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455718.1096773-961-35034183978469/AnsiballZ_copy.py'
Jan 26 19:28:38 compute-0 sudo[192237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:38 compute-0 python3.9[192239]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769455718.1096773-961-35034183978469/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:38 compute-0 sudo[192237]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:39 compute-0 sudo[192313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexmrwzutujfuupohuturkonbjmhilbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455718.1096773-961-35034183978469/AnsiballZ_systemd.py'
Jan 26 19:28:39 compute-0 sudo[192313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:39 compute-0 python3.9[192315]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:28:39 compute-0 systemd[1]: Reloading.
Jan 26 19:28:39 compute-0 systemd-rc-local-generator[192343]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:28:39 compute-0 systemd-sysv-generator[192346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:28:40 compute-0 sudo[192313]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:40 compute-0 sudo[192424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axafbsaelqnhahxlmrzrbmarqvaxrgsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455718.1096773-961-35034183978469/AnsiballZ_systemd.py'
Jan 26 19:28:40 compute-0 sudo[192424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:40 compute-0 python3.9[192426]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.738 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.739 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.739 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.740 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.740 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.740 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 nova_compute[183177]: 2026-01-26 19:28:40.740 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:40 compute-0 systemd[1]: Reloading.
Jan 26 19:28:40 compute-0 systemd-rc-local-generator[192459]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:28:40 compute-0 systemd-sysv-generator[192463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:28:41 compute-0 systemd[1]: Starting podman_exporter container...
Jan 26 19:28:41 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:28:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c7211b049715bf3e44e6694ea39820829dfa12fc019ef020915e4dad62cb3e9/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 19:28:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c7211b049715bf3e44e6694ea39820829dfa12fc019ef020915e4dad62cb3e9/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.251 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.253 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.253 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba.
Jan 26 19:28:41 compute-0 podman[192467]: 2026-01-26 19:28:41.28670588 +0000 UTC m=+0.156231476 container init 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.308Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.308Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.309Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.309Z caller=handler.go:105 level=info collector=container
Jan 26 19:28:41 compute-0 podman[192467]: 2026-01-26 19:28:41.327938156 +0000 UTC m=+0.197463702 container start 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:28:41 compute-0 podman[192467]: podman_exporter
Jan 26 19:28:41 compute-0 systemd[1]: Starting Podman API Service...
Jan 26 19:28:41 compute-0 systemd[1]: Started podman_exporter container.
Jan 26 19:28:41 compute-0 systemd[1]: Started Podman API Service.
Jan 26 19:28:41 compute-0 sudo[192424]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="Setting parallel job count to 25"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="Using sqlite as database backend"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 26 19:28:41 compute-0 podman[192499]: @ - - [26/Jan/2026:19:28:41 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 26 19:28:41 compute-0 podman[192499]: time="2026-01-26T19:28:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:28:41 compute-0 podman[192493]: 2026-01-26 19:28:41.409877874 +0000 UTC m=+0.073642360 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:28:41 compute-0 systemd[1]: 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba-3eb946586bc08ac7.service: Main process exited, code=exited, status=1/FAILURE
Jan 26 19:28:41 compute-0 systemd[1]: 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba-3eb946586bc08ac7.service: Failed with result 'exit-code'.
Jan 26 19:28:41 compute-0 podman[192499]: @ - - [26/Jan/2026:19:28:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12117 "" "Go-http-client/1.1"
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.421Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.421Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 26 19:28:41 compute-0 podman_exporter[192482]: ts=2026-01-26T19:28:41.422Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.770 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.770 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.770 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.771 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.957 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.958 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.976 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.977 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6085MB free_disk=73.25139617919922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.977 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:28:41 compute-0 nova_compute[183177]: 2026-01-26 19:28:41.977 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:28:42 compute-0 python3.9[192677]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 19:28:43 compute-0 nova_compute[183177]: 2026-01-26 19:28:43.036 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:28:43 compute-0 nova_compute[183177]: 2026-01-26 19:28:43.036 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:28:41 up 53 min,  0 user,  load average: 1.47, 1.07, 0.74\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:28:43 compute-0 nova_compute[183177]: 2026-01-26 19:28:43.067 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:28:43 compute-0 sudo[192827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqpttbadsvfdljklonumtlivxlhsxvck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455723.163654-1051-275216322142760/AnsiballZ_stat.py'
Jan 26 19:28:43 compute-0 sudo[192827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:43 compute-0 nova_compute[183177]: 2026-01-26 19:28:43.576 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:28:43 compute-0 python3.9[192829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:43 compute-0 sudo[192827]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:44 compute-0 nova_compute[183177]: 2026-01-26 19:28:44.087 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:28:44 compute-0 nova_compute[183177]: 2026-01-26 19:28:44.088 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:28:44 compute-0 nova_compute[183177]: 2026-01-26 19:28:44.088 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:28:44 compute-0 sudo[192952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jczghqssweounobroxfytnkwrjwjnmum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455723.163654-1051-275216322142760/AnsiballZ_copy.py'
Jan 26 19:28:44 compute-0 sudo[192952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:44 compute-0 python3.9[192954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455723.163654-1051-275216322142760/.source.yaml _original_basename=.pcpav41o follow=False checksum=09d0f47d95035a278481d4c39a26474c827de280 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:44 compute-0 sudo[192952]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:44 compute-0 sudo[193104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfjsirtpmrddcpvxsjcxecwuhrkxyuzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455724.6452127-1081-209291867297432/AnsiballZ_stat.py'
Jan 26 19:28:44 compute-0 sudo[193104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:45 compute-0 python3.9[193106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:28:45 compute-0 sudo[193104]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:45 compute-0 sudo[193227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glitdnavlnlgjayfdrnsattxmkvsjnqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455724.6452127-1081-209291867297432/AnsiballZ_copy.py'
Jan 26 19:28:45 compute-0 sudo[193227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:45 compute-0 python3.9[193229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769455724.6452127-1081-209291867297432/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:45 compute-0 sudo[193227]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:46 compute-0 sudo[193379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvjoyxwipxonlvgfmbssmnbcvqjlspok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455726.5424008-1123-26050023502721/AnsiballZ_file.py'
Jan 26 19:28:46 compute-0 sudo[193379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:47 compute-0 python3.9[193381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:47 compute-0 sudo[193379]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:47 compute-0 sudo[193531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjdogxpclssoqqywbpaooxssweiqfzfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455727.264319-1139-215225406859325/AnsiballZ_file.py'
Jan 26 19:28:47 compute-0 sudo[193531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:47 compute-0 python3.9[193533]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 26 19:28:47 compute-0 sudo[193531]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:48 compute-0 python3.9[193683]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:50 compute-0 sudo[194104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqmfnevprrfvguorwbaalelgjmefudhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455730.4282165-1207-131628060978385/AnsiballZ_container_config_data.py'
Jan 26 19:28:50 compute-0 sudo[194104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:50 compute-0 python3.9[194106]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 26 19:28:50 compute-0 sudo[194104]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:51 compute-0 sudo[194256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoumhtsbvgdkrcqzjyveufqdajgdxmnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455731.474078-1229-42824233547059/AnsiballZ_container_config_hash.py'
Jan 26 19:28:51 compute-0 sudo[194256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:51 compute-0 python3.9[194258]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 26 19:28:52 compute-0 sudo[194256]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:52 compute-0 sudo[194408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhexzmhfhmpntczknohnilbngrxhtzb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455732.445276-1249-123238153901146/AnsiballZ_edpm_container_manage.py'
Jan 26 19:28:52 compute-0 sudo[194408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:53 compute-0 python3[194410]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 26 19:28:55 compute-0 podman[194425]: 2026-01-26 19:28:55.89838944 +0000 UTC m=+2.679108298 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 19:28:56 compute-0 podman[194523]: 2026-01-26 19:28:56.040904808 +0000 UTC m=+0.047397627 container create b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Jan 26 19:28:56 compute-0 podman[194523]: 2026-01-26 19:28:56.017875494 +0000 UTC m=+0.024368333 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 19:28:56 compute-0 python3[194410]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 26 19:28:56 compute-0 sudo[194408]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:56 compute-0 sudo[194711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbyuxhqguyryfvyvjplgnfwqgwwoxnbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455736.4019246-1265-127648530421229/AnsiballZ_stat.py'
Jan 26 19:28:56 compute-0 sudo[194711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:56 compute-0 python3.9[194713]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:56 compute-0 sudo[194711]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:57 compute-0 sudo[194878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovxlnbijgmkwpzhctyoxskkhydajlmfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455737.2735584-1283-124258224654387/AnsiballZ_file.py'
Jan 26 19:28:57 compute-0 sudo[194878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:57 compute-0 podman[194839]: 2026-01-26 19:28:57.719476424 +0000 UTC m=+0.131454384 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:28:57 compute-0 python3.9[194884]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:57 compute-0 sudo[194878]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:58 compute-0 sudo[194967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqkjksohmzomutzhydhtboyvbwoqyiqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455737.2735584-1283-124258224654387/AnsiballZ_stat.py'
Jan 26 19:28:58 compute-0 sudo[194967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:58 compute-0 python3.9[194969]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:28:58 compute-0 sudo[194967]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:59 compute-0 sudo[195118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xriomwptkctksyndugpqjdffzfrnbtnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455738.4606054-1283-191025373617635/AnsiballZ_copy.py'
Jan 26 19:28:59 compute-0 sudo[195118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:59 compute-0 python3.9[195120]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769455738.4606054-1283-191025373617635/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:28:59 compute-0 sudo[195118]: pam_unix(sudo:session): session closed for user root
Jan 26 19:28:59 compute-0 sudo[195194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcabwbvmvkiyfuthszqxszefqfgitxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455738.4606054-1283-191025373617635/AnsiballZ_systemd.py'
Jan 26 19:28:59 compute-0 sudo[195194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:28:59 compute-0 python3.9[195196]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 26 19:28:59 compute-0 systemd[1]: Reloading.
Jan 26 19:28:59 compute-0 systemd-rc-local-generator[195215]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:28:59 compute-0 systemd-sysv-generator[195222]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:29:00 compute-0 sudo[195194]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:00 compute-0 sudo[195305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaczphcfrhxhhanlxbybuqbnubijjjgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455738.4606054-1283-191025373617635/AnsiballZ_systemd.py'
Jan 26 19:29:00 compute-0 sudo[195305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:00 compute-0 python3.9[195307]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:29:00 compute-0 systemd[1]: Reloading.
Jan 26 19:29:00 compute-0 systemd-rc-local-generator[195339]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:29:01 compute-0 systemd-sysv-generator[195344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:29:01 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 26 19:29:01 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:29:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d882b62d80306dfb968ac41ea4df3dbd3143fef2c737a71a45faddf260e4f256/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 26 19:29:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d882b62d80306dfb968ac41ea4df3dbd3143fef2c737a71a45faddf260e4f256/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 26 19:29:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d882b62d80306dfb968ac41ea4df3dbd3143fef2c737a71a45faddf260e4f256/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 26 19:29:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b.
Jan 26 19:29:01 compute-0 podman[195348]: 2026-01-26 19:29:01.404665446 +0000 UTC m=+0.175993081 container init b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *bridge.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *coverage.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *datapath.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *iface.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *memory.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *ovn.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *pmd_perf.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *pmd_rxq.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: INFO    19:29:01 main.go:48: registering *vswitch.Collector
Jan 26 19:29:01 compute-0 openstack_network_exporter[195363]: NOTICE  19:29:01 main.go:76: listening on https://:9105/metrics
Jan 26 19:29:01 compute-0 podman[195348]: 2026-01-26 19:29:01.435472815 +0000 UTC m=+0.206800460 container start b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, distribution-scope=public)
Jan 26 19:29:01 compute-0 podman[195348]: openstack_network_exporter
Jan 26 19:29:01 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 26 19:29:01 compute-0 sudo[195305]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:01 compute-0 podman[195368]: 2026-01-26 19:29:01.541640671 +0000 UTC m=+0.092001997 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:29:02 compute-0 podman[195547]: 2026-01-26 19:29:02.313251544 +0000 UTC m=+0.061547657 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:29:02 compute-0 python3.9[195546]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 26 19:29:03 compute-0 sudo[195716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loyvpcretovxswfwetazieesefufjpfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455743.0371153-1373-118238379069276/AnsiballZ_stat.py'
Jan 26 19:29:03 compute-0 sudo[195716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:03 compute-0 python3.9[195718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:03 compute-0 sudo[195716]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:04 compute-0 sudo[195841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgpfhmzezmaunpwxocawdtdsfynecvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455743.0371153-1373-118238379069276/AnsiballZ_copy.py'
Jan 26 19:29:04 compute-0 sudo[195841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:04 compute-0 python3.9[195843]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455743.0371153-1373-118238379069276/.source.yaml _original_basename=.xutn_0x_ follow=False checksum=e6c22eec0ba5ae630d6617517c690c7967e85fe9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:04 compute-0 sudo[195841]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:04 compute-0 sudo[195993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxwfvbqtmqfgbvuliwcknuxtvfmdikrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455744.623498-1403-65299509116958/AnsiballZ_find.py'
Jan 26 19:29:04 compute-0 sudo[195993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:05 compute-0 python3.9[195995]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 26 19:29:05 compute-0 sudo[195993]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:06 compute-0 sudo[196145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfuxspavkvgtzmgqihxuamacxjnzqwzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455745.6426787-1422-211474865909287/AnsiballZ_podman_container_info.py'
Jan 26 19:29:06 compute-0 sudo[196145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:06 compute-0 python3.9[196147]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 26 19:29:06 compute-0 sudo[196145]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:07 compute-0 sudo[196309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qefktxwuyshhyvaicfpjlosholaqpgcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455746.868659-1430-108997022432374/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:07 compute-0 sudo[196309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:07 compute-0 python3.9[196311]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:07 compute-0 systemd[1]: Started libpod-conmon-790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172.scope.
Jan 26 19:29:07 compute-0 podman[196312]: 2026-01-26 19:29:07.822037338 +0000 UTC m=+0.128758679 container exec 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 19:29:07 compute-0 podman[196312]: 2026-01-26 19:29:07.857717182 +0000 UTC m=+0.164438523 container exec_died 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 19:29:07 compute-0 systemd[1]: libpod-conmon-790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172.scope: Deactivated successfully.
Jan 26 19:29:07 compute-0 sudo[196309]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:08 compute-0 sudo[196496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqejebehnaizuywlhshiihzoiaxkryuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455748.130469-1438-243526734127901/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:08 compute-0 sudo[196496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:08 compute-0 python3.9[196498]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:08 compute-0 systemd[1]: Started libpod-conmon-790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172.scope.
Jan 26 19:29:08 compute-0 podman[196499]: 2026-01-26 19:29:08.913510236 +0000 UTC m=+0.099771470 container exec 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:29:08 compute-0 podman[196499]: 2026-01-26 19:29:08.946992139 +0000 UTC m=+0.133253383 container exec_died 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 26 19:29:08 compute-0 systemd[1]: libpod-conmon-790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172.scope: Deactivated successfully.
Jan 26 19:29:08 compute-0 sudo[196496]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:09 compute-0 sudo[196681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeklyweufuffczxwlhqdspargpxzkvjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455749.179852-1446-248603469448309/AnsiballZ_file.py'
Jan 26 19:29:09 compute-0 sudo[196681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:09 compute-0 python3.9[196683]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:09 compute-0 sudo[196681]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:10 compute-0 sudo[196833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrboocemshssbelqtatwtthocyzboaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455750.0777514-1455-96724799594349/AnsiballZ_podman_container_info.py'
Jan 26 19:29:10 compute-0 sudo[196833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:10 compute-0 python3.9[196835]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 26 19:29:10 compute-0 sudo[196833]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:11 compute-0 sudo[196998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcvhgctrfyzrdmfwwzdonrrfadmuomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455750.9422467-1463-205676501071793/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:11 compute-0 sudo[196998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:11 compute-0 python3.9[197000]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:11 compute-0 systemd[1]: Started libpod-conmon-c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6.scope.
Jan 26 19:29:11 compute-0 podman[197001]: 2026-01-26 19:29:11.623459253 +0000 UTC m=+0.103396431 container exec c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:29:11 compute-0 podman[197001]: 2026-01-26 19:29:11.656487403 +0000 UTC m=+0.136424611 container exec_died c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:29:11 compute-0 systemd[1]: libpod-conmon-c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6.scope: Deactivated successfully.
Jan 26 19:29:11 compute-0 sudo[196998]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:11 compute-0 podman[197017]: 2026-01-26 19:29:11.709934066 +0000 UTC m=+0.080043746 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:29:12 compute-0 sudo[197203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxmmsdqepexkiyznoklrzufeckqsulnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455751.8893096-1471-130751771716058/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:12 compute-0 sudo[197203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:12 compute-0 python3.9[197205]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:12 compute-0 systemd[1]: Started libpod-conmon-c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6.scope.
Jan 26 19:29:12 compute-0 podman[197206]: 2026-01-26 19:29:12.682122466 +0000 UTC m=+0.116497361 container exec c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 19:29:12 compute-0 podman[197206]: 2026-01-26 19:29:12.716432592 +0000 UTC m=+0.150807387 container exec_died c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:29:12 compute-0 sudo[197203]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:12 compute-0 systemd[1]: libpod-conmon-c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6.scope: Deactivated successfully.
Jan 26 19:29:13 compute-0 sudo[197388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygaacyncjrzixyryoxcxdrrhbsrowov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455753.0070918-1479-241238095894938/AnsiballZ_file.py'
Jan 26 19:29:13 compute-0 sudo[197388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:13 compute-0 python3.9[197390]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:13 compute-0 sudo[197388]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:14 compute-0 sudo[197540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irjnechtjwkkrliqcelqskgoboyrszdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455753.9108524-1488-240939698144723/AnsiballZ_podman_container_info.py'
Jan 26 19:29:14 compute-0 sudo[197540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:14 compute-0 python3.9[197542]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 26 19:29:14 compute-0 sudo[197540]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:15 compute-0 sudo[197705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czwaiufkuylanaqlavxnykdoobamazxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455754.8682935-1496-116646467342371/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:15 compute-0 sudo[197705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:15 compute-0 python3.9[197707]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:15 compute-0 systemd[1]: Started libpod-conmon-905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba.scope.
Jan 26 19:29:15 compute-0 podman[197708]: 2026-01-26 19:29:15.552457763 +0000 UTC m=+0.095970085 container exec 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:29:15 compute-0 podman[197708]: 2026-01-26 19:29:15.588574819 +0000 UTC m=+0.132087151 container exec_died 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:29:15 compute-0 systemd[1]: libpod-conmon-905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba.scope: Deactivated successfully.
Jan 26 19:29:15 compute-0 sudo[197705]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:16 compute-0 sudo[197890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfnpjakyldurkyghwbpkfffwiveiechy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455755.878569-1504-74647103843250/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:16 compute-0 sudo[197890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:16 compute-0 python3.9[197892]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:16 compute-0 systemd[1]: Started libpod-conmon-905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba.scope.
Jan 26 19:29:16 compute-0 podman[197893]: 2026-01-26 19:29:16.511241325 +0000 UTC m=+0.075455171 container exec 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:29:16 compute-0 podman[197893]: 2026-01-26 19:29:16.542716303 +0000 UTC m=+0.106930139 container exec_died 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:29:16 compute-0 sudo[197890]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:16 compute-0 systemd[1]: libpod-conmon-905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba.scope: Deactivated successfully.
Jan 26 19:29:17 compute-0 sudo[198075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cykafmtmdiktsnhqwgmazclvntpcsnek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455756.7896757-1512-105062247583094/AnsiballZ_file.py'
Jan 26 19:29:17 compute-0 sudo[198075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:17 compute-0 python3.9[198077]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:17 compute-0 sudo[198075]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:18 compute-0 sudo[198227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqjabrallmnomqbvpqwddmxntlhnpiuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455757.6432557-1521-116420689496090/AnsiballZ_podman_container_info.py'
Jan 26 19:29:18 compute-0 sudo[198227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:18 compute-0 python3.9[198229]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 26 19:29:18 compute-0 sudo[198227]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:18 compute-0 sudo[198392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nivutpbtthpukibfwfahsslxoczsqffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455758.6222088-1529-145412150764212/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:18 compute-0 sudo[198392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:19 compute-0 python3.9[198394]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:19 compute-0 systemd[1]: Started libpod-conmon-b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b.scope.
Jan 26 19:29:19 compute-0 podman[198395]: 2026-01-26 19:29:19.315810209 +0000 UTC m=+0.085739793 container exec b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Jan 26 19:29:19 compute-0 podman[198395]: 2026-01-26 19:29:19.346346531 +0000 UTC m=+0.116276075 container exec_died b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:29:19 compute-0 systemd[1]: libpod-conmon-b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b.scope: Deactivated successfully.
Jan 26 19:29:19 compute-0 sudo[198392]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:20 compute-0 sudo[198575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucpetiaruvqqvzyzfrgsyqqknchgcthy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455759.6249616-1537-46817929947370/AnsiballZ_podman_container_exec.py'
Jan 26 19:29:20 compute-0 sudo[198575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:20 compute-0 python3.9[198577]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 26 19:29:20 compute-0 systemd[1]: Started libpod-conmon-b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b.scope.
Jan 26 19:29:20 compute-0 podman[198578]: 2026-01-26 19:29:20.358823522 +0000 UTC m=+0.089624881 container exec b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 19:29:20 compute-0 podman[198578]: 2026-01-26 19:29:20.393666932 +0000 UTC m=+0.124468291 container exec_died b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 19:29:20 compute-0 systemd[1]: libpod-conmon-b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b.scope: Deactivated successfully.
Jan 26 19:29:20 compute-0 sudo[198575]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:21 compute-0 sudo[198758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzkduhhvinodyarakmtnupwgzlxhxhfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455760.6632042-1545-218738738779310/AnsiballZ_file.py'
Jan 26 19:29:21 compute-0 sudo[198758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:21 compute-0 python3.9[198760]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:21 compute-0 sudo[198758]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:29:24.010 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:29:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:29:24.011 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:29:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:29:24.011 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:29:28 compute-0 podman[198786]: 2026-01-26 19:29:28.441416672 +0000 UTC m=+0.179546668 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 19:29:32 compute-0 podman[198814]: 2026-01-26 19:29:32.353468845 +0000 UTC m=+0.086364041 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, distribution-scope=public)
Jan 26 19:29:32 compute-0 podman[198835]: 2026-01-26 19:29:32.466633004 +0000 UTC m=+0.081601511 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 26 19:29:35 compute-0 sudo[198981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqjwpwkbclsglyxukpiuwoiqzndveiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455775.2908213-1687-275559126817289/AnsiballZ_file.py'
Jan 26 19:29:35 compute-0 sudo[198981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:35 compute-0 python3.9[198983]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:35 compute-0 sudo[198981]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:36 compute-0 sudo[199133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onxkmyxlvpjkbttovgostsseveeehloz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455776.0991268-1703-168768699999569/AnsiballZ_stat.py'
Jan 26 19:29:36 compute-0 sudo[199133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:36 compute-0 nova_compute[183177]: 2026-01-26 19:29:36.500 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:36 compute-0 nova_compute[183177]: 2026-01-26 19:29:36.501 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:36 compute-0 python3.9[199135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:36 compute-0 sudo[199133]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.019 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.020 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.020 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.020 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.021 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.021 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.021 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.022 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:29:37 compute-0 sudo[199256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-choesllyjsrtzwlfbanjcducgcchtann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455776.0991268-1703-168768699999569/AnsiballZ_copy.py'
Jan 26 19:29:37 compute-0 sudo[199256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:37 compute-0 python3.9[199258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455776.0991268-1703-168768699999569/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:37 compute-0 sudo[199256]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.547 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.547 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.547 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.548 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.679 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.680 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.694 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.695 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5990MB free_disk=73.1373519897461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.695 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:29:37 compute-0 nova_compute[183177]: 2026-01-26 19:29:37.696 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:29:37 compute-0 sudo[199409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcswypspvtpponzmlqnupsywevkhiwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455777.5984948-1735-151601593166263/AnsiballZ_file.py'
Jan 26 19:29:37 compute-0 sudo[199409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:38 compute-0 python3.9[199411]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:38 compute-0 sudo[199409]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:38 compute-0 sudo[199561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaedxhsecciyulavleidarlcgoouspey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455778.3476872-1751-136676429134756/AnsiballZ_stat.py'
Jan 26 19:29:38 compute-0 sudo[199561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:38 compute-0 nova_compute[183177]: 2026-01-26 19:29:38.744 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:29:38 compute-0 nova_compute[183177]: 2026-01-26 19:29:38.744 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:29:37 up 53 min,  0 user,  load average: 1.19, 1.05, 0.75\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:29:38 compute-0 nova_compute[183177]: 2026-01-26 19:29:38.768 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:29:38 compute-0 python3.9[199563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:38 compute-0 sudo[199561]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:39 compute-0 sudo[199639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esivvjfdlvtvovoghrhdlgitubitpkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455778.3476872-1751-136676429134756/AnsiballZ_file.py'
Jan 26 19:29:39 compute-0 sudo[199639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:39 compute-0 nova_compute[183177]: 2026-01-26 19:29:39.276 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:29:39 compute-0 python3.9[199641]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:39 compute-0 sudo[199639]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:39 compute-0 nova_compute[183177]: 2026-01-26 19:29:39.788 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:29:39 compute-0 nova_compute[183177]: 2026-01-26 19:29:39.788 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:29:40 compute-0 sudo[199791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yatqajdiehfebzpyhvqggojadxeoazna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455779.697937-1775-56935000842844/AnsiballZ_stat.py'
Jan 26 19:29:40 compute-0 sudo[199791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:40 compute-0 python3.9[199793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:40 compute-0 sudo[199791]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:40 compute-0 sudo[199869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zurfyxkdjfjrgolvajiiwudjbellyuwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455779.697937-1775-56935000842844/AnsiballZ_file.py'
Jan 26 19:29:40 compute-0 sudo[199869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:40 compute-0 python3.9[199871]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mkh1jfd0 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:40 compute-0 sudo[199869]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:41 compute-0 sudo[200021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwntcetjumluumcfjicxdkifufqtsgge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455781.1222498-1799-45301648615306/AnsiballZ_stat.py'
Jan 26 19:29:41 compute-0 sudo[200021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:41 compute-0 python3.9[200023]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:41 compute-0 sudo[200021]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:42 compute-0 podman[200073]: 2026-01-26 19:29:42.047291176 +0000 UTC m=+0.069472142 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:29:42 compute-0 sudo[200118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quvwzajlmubcltcwabafrbkwncanscrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455781.1222498-1799-45301648615306/AnsiballZ_file.py'
Jan 26 19:29:42 compute-0 sudo[200118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:42 compute-0 python3.9[200125]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:42 compute-0 sudo[200118]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:42 compute-0 sudo[200275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpnzptuxvgcvbxmtkuhxkexhmgyokzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455782.5543282-1825-11266286600148/AnsiballZ_command.py'
Jan 26 19:29:42 compute-0 sudo[200275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:43 compute-0 python3.9[200277]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:29:43 compute-0 sudo[200275]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:43 compute-0 sudo[200428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsqnbedlkfwbowwiliqmljoeadndmtg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769455783.3419282-1841-40621204923807/AnsiballZ_edpm_nftables_from_files.py'
Jan 26 19:29:43 compute-0 sudo[200428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:44 compute-0 python3[200430]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 26 19:29:44 compute-0 sudo[200428]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:44 compute-0 sudo[200580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezhyglhsjmdamavzrsjdiexjkdngsqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455784.2371564-1857-259306792933990/AnsiballZ_stat.py'
Jan 26 19:29:44 compute-0 sudo[200580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:44 compute-0 python3.9[200582]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:44 compute-0 sudo[200580]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:45 compute-0 sudo[200658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smokbjkyikteroheokwyebugngehcgxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455784.2371564-1857-259306792933990/AnsiballZ_file.py'
Jan 26 19:29:45 compute-0 sudo[200658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:45 compute-0 python3.9[200660]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:45 compute-0 sudo[200658]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:46 compute-0 sudo[200810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpttotfttliketxziuwymznyoxkngwse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455785.736032-1881-254623816542054/AnsiballZ_stat.py'
Jan 26 19:29:46 compute-0 sudo[200810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:46 compute-0 python3.9[200812]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:46 compute-0 sudo[200810]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:46 compute-0 sudo[200888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktcfvlnatmacxwafijdwfdsmsjxbikak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455785.736032-1881-254623816542054/AnsiballZ_file.py'
Jan 26 19:29:46 compute-0 sudo[200888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:47 compute-0 python3.9[200890]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:47 compute-0 sudo[200888]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:47 compute-0 sudo[201040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmpqhpxdkbxhathgrfjcvlkwfhtnrhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455787.2209802-1905-122796289587670/AnsiballZ_stat.py'
Jan 26 19:29:47 compute-0 sudo[201040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:47 compute-0 python3.9[201042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:47 compute-0 sudo[201040]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:48 compute-0 sudo[201118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oexbngbjsufelzjevegsqtsitsshpmpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455787.2209802-1905-122796289587670/AnsiballZ_file.py'
Jan 26 19:29:48 compute-0 sudo[201118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:48 compute-0 python3.9[201120]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:48 compute-0 sudo[201118]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:48 compute-0 sudo[201270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxmbytjrrgzuxwedpvzqqrzozyaihjzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455788.5583231-1929-252291979298752/AnsiballZ_stat.py'
Jan 26 19:29:48 compute-0 sudo[201270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:49 compute-0 python3.9[201272]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:49 compute-0 sudo[201270]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:49 compute-0 sudo[201348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmoacbgujwfcdwdpkxkateukgfnikozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455788.5583231-1929-252291979298752/AnsiballZ_file.py'
Jan 26 19:29:49 compute-0 sudo[201348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:49 compute-0 python3.9[201350]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:49 compute-0 sudo[201348]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:50 compute-0 sudo[201500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrwehdctxeaqsguljtatlqctonopimki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455789.8763542-1953-161528177819683/AnsiballZ_stat.py'
Jan 26 19:29:50 compute-0 sudo[201500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:50 compute-0 python3.9[201502]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 26 19:29:50 compute-0 sudo[201500]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:50 compute-0 sudo[201625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygscbfauvzfumogbzgqzsdhvnudutkds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455789.8763542-1953-161528177819683/AnsiballZ_copy.py'
Jan 26 19:29:50 compute-0 sudo[201625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:51 compute-0 python3.9[201627]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769455789.8763542-1953-161528177819683/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:51 compute-0 sudo[201625]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:51 compute-0 sudo[201777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackzxutnvyliltbbadzoqkmeshzpizrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455791.4094312-1983-182111759811995/AnsiballZ_file.py'
Jan 26 19:29:51 compute-0 sudo[201777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:52 compute-0 python3.9[201779]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:52 compute-0 sudo[201777]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:52 compute-0 sudo[201929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwjeybwrqquvxrcjodduoqebywslqgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455792.2281024-1999-99241012161093/AnsiballZ_command.py'
Jan 26 19:29:52 compute-0 sudo[201929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:52 compute-0 python3.9[201931]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:29:52 compute-0 sudo[201929]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:53 compute-0 auditd[700]: Audit daemon rotating log files
Jan 26 19:29:53 compute-0 sudo[202084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uliqwiadssvezpvlgygwczwxjmvexolz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455793.041193-2015-184062424453593/AnsiballZ_blockinfile.py'
Jan 26 19:29:53 compute-0 sudo[202084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:53 compute-0 python3.9[202086]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:53 compute-0 sudo[202084]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:54 compute-0 sudo[202236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywofjmsozcjzsnijnvuohkyxiuflvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455794.0398054-2033-85682004360253/AnsiballZ_command.py'
Jan 26 19:29:54 compute-0 sudo[202236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:54 compute-0 python3.9[202238]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:29:54 compute-0 sudo[202236]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:55 compute-0 sudo[202389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwcrsyqnscuonaqncxhsfffkrlxjcrjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455794.8846185-2049-145350426161594/AnsiballZ_stat.py'
Jan 26 19:29:55 compute-0 sudo[202389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:55 compute-0 python3.9[202391]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 26 19:29:55 compute-0 sudo[202389]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:56 compute-0 sudo[202543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btiwdtuaxeumwtglujlzaqopxtizhbtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455795.6775682-2065-270981001086335/AnsiballZ_command.py'
Jan 26 19:29:56 compute-0 sudo[202543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:56 compute-0 python3.9[202545]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:29:56 compute-0 sudo[202543]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:56 compute-0 sudo[202698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipjqcpcerdgzbewdbqthqjerjuecugxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769455796.5042899-2081-223099819755510/AnsiballZ_file.py'
Jan 26 19:29:56 compute-0 sudo[202698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:29:57 compute-0 python3.9[202700]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:29:57 compute-0 sudo[202698]: pam_unix(sudo:session): session closed for user root
Jan 26 19:29:57 compute-0 sshd-session[183503]: Connection closed by 192.168.122.30 port 35746
Jan 26 19:29:57 compute-0 sshd-session[183500]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:29:57 compute-0 systemd-logind[794]: Session 26 logged out. Waiting for processes to exit.
Jan 26 19:29:57 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 26 19:29:57 compute-0 systemd[1]: session-26.scope: Consumed 1min 32.302s CPU time.
Jan 26 19:29:57 compute-0 systemd-logind[794]: Removed session 26.
Jan 26 19:29:59 compute-0 podman[202725]: 2026-01-26 19:29:59.386472363 +0000 UTC m=+0.128102799 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 19:29:59 compute-0 podman[192499]: time="2026-01-26T19:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:29:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:29:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Jan 26 19:30:01 compute-0 openstack_network_exporter[195363]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:30:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:30:01 compute-0 openstack_network_exporter[195363]: ERROR   19:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:30:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:30:03 compute-0 podman[202759]: 2026-01-26 19:30:03.343799732 +0000 UTC m=+0.086181993 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 26 19:30:03 compute-0 podman[202760]: 2026-01-26 19:30:03.349371769 +0000 UTC m=+0.081604765 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 19:30:03 compute-0 sshd-session[202794]: Accepted publickey for zuul from 38.102.83.66 port 57558 ssh2: RSA SHA256:z5yXGhLPOexSpG1aW8Zw3EOkOyqOHIRm+m5qLpa/9+A
Jan 26 19:30:03 compute-0 systemd-logind[794]: New session 27 of user zuul.
Jan 26 19:30:03 compute-0 systemd[1]: Started Session 27 of User zuul.
Jan 26 19:30:03 compute-0 sshd-session[202794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 19:30:03 compute-0 sudo[202821]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajuuvunjoupccgknvvkuikkauijiwfde ; /usr/bin/python3'
Jan 26 19:30:03 compute-0 sudo[202821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:04 compute-0 python3[202823]: ansible-ansible.legacy.dnf Invoked with name=['nfs-utils', 'iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Jan 26 19:30:05 compute-0 sudo[202821]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:05 compute-0 sudo[202848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kevzpijmmyrjcoxnsrbyhftzubipscvk ; /usr/bin/python3'
Jan 26 19:30:05 compute-0 sudo[202848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:05 compute-0 python3[202850]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=vers3 value=n backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:05 compute-0 sudo[202848]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:06 compute-0 sudo[202876]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvyfqybwvytocrndlqhplovzfiftywv ; /usr/bin/python3'
Jan 26 19:30:06 compute-0 sudo[202876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:06 compute-0 python3[202878]: ansible-ansible.builtin.systemd_service Invoked with name=rpc-statd.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Jan 26 19:30:06 compute-0 systemd[1]: Reloading.
Jan 26 19:30:06 compute-0 systemd-rc-local-generator[202903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:30:06 compute-0 systemd-sysv-generator[202911]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:30:06 compute-0 sudo[202876]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:06 compute-0 sudo[202938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-runljgphuxtozeyppqwkadraovulipyf ; /usr/bin/python3'
Jan 26 19:30:06 compute-0 sudo[202938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:07 compute-0 python3[202940]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.service masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Jan 26 19:30:07 compute-0 systemd[1]: Reloading.
Jan 26 19:30:07 compute-0 systemd-sysv-generator[202969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:30:07 compute-0 systemd-rc-local-generator[202965]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:30:07 compute-0 systemd[1]: rpcbind.service: Current command vanished from the unit file, execution of the command list won't be resumed.
Jan 26 19:30:07 compute-0 sudo[202938]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:07 compute-0 sudo[203001]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqpavficlsmshsxkgxdamxcdpjjqwxvp ; /usr/bin/python3'
Jan 26 19:30:07 compute-0 sudo[203001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:08 compute-0 python3[203003]: ansible-ansible.builtin.systemd_service Invoked with name=rpcbind.socket masked=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None enabled=None force=None
Jan 26 19:30:08 compute-0 systemd[1]: Reloading.
Jan 26 19:30:08 compute-0 systemd-sysv-generator[203037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:30:08 compute-0 systemd-rc-local-generator[203033]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:30:08 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Jan 26 19:30:08 compute-0 sudo[203001]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:08 compute-0 sudo[203064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnmxyemkjcuhkhsghsbiczkicowbjuwv ; /usr/bin/python3'
Jan 26 19:30:08 compute-0 sudo[203064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:08 compute-0 python3[203066]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_1 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:08 compute-0 sudo[203064]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:08 compute-0 sudo[203090]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yubpccvpcrboknlvutulsefwvsfiqslw ; /usr/bin/python3'
Jan 26 19:30:08 compute-0 sudo[203090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:09 compute-0 python3[203092]: ansible-ansible.builtin.file Invoked with path=/data/cinder_backend_2 state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:09 compute-0 sudo[203090]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:09 compute-0 sudo[203116]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enrnqqegmmsukgoweheziblsbrbbqocl ; /usr/bin/python3'
Jan 26 19:30:09 compute-0 sudo[203116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:09 compute-0 python3[203118]: ansible-ansible.builtin.file Invoked with path=/data/cinderbackup state=directory mode=755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:09 compute-0 sudo[203116]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:11 compute-0 sudo[203194]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwvyzayoxuvduaofltdckfnplfrfvnxc ; /usr/bin/python3'
Jan 26 19:30:11 compute-0 sudo[203194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:11 compute-0 python3[203196]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/nfs-server.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 26 19:30:11 compute-0 sudo[203194]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:11 compute-0 sudo[203267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xelwumvmdwpytkfuqlxcpztpcszreimy ; /usr/bin/python3'
Jan 26 19:30:11 compute-0 sudo[203267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:11 compute-0 python3[203269]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/nfs-server.nft mode=0666 src=/home/zuul/.ansible/tmp/ansible-tmp-1769455811.0578582-36984-172176528811531/source _original_basename=tmp0iwygasz follow=False checksum=f91e6a2e98f3d3c48705976f5b33f9e81e7cf7f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:11 compute-0 sudo[203267]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:12 compute-0 sudo[203317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltwvvivagbdefdfjpbkrdopbohtmvwk ; /usr/bin/python3'
Jan 26 19:30:12 compute-0 sudo[203317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:12 compute-0 podman[203319]: 2026-01-26 19:30:12.306753654 +0000 UTC m=+0.076279004 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:30:12 compute-0 python3[203320]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/sysconfig/nftables.conf line=include "/etc/nftables/nfs-server.nft" insertafter=EOF state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:12 compute-0 sudo[203317]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:12 compute-0 sudo[203368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aewmavvtnukyvqjepolzvfymtjcmpqph ; /usr/bin/python3'
Jan 26 19:30:12 compute-0 sudo[203368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:13 compute-0 python3[203370]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 26 19:30:13 compute-0 systemd[1]: Stopping Netfilter Tables...
Jan 26 19:30:13 compute-0 systemd[1]: nftables.service: Deactivated successfully.
Jan 26 19:30:13 compute-0 systemd[1]: Stopped Netfilter Tables.
Jan 26 19:30:13 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 26 19:30:13 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 26 19:30:13 compute-0 sudo[203368]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:13 compute-0 sudo[203399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzdooueshgtezxmwzdecwkoyguabmhzf ; /usr/bin/python3'
Jan 26 19:30:13 compute-0 sudo[203399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:13 compute-0 python3[203401]: ansible-community.general.ini_file Invoked with path=/etc/nfs.conf section=nfsd option=host value=172.18.0.100 backup=True mode=0644 state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:13 compute-0 sudo[203399]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:13 compute-0 sudo[203427]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcnzzfmbmerpkapfrrbrztfopefkdzmz ; /usr/bin/python3'
Jan 26 19:30:13 compute-0 sudo[203427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:14 compute-0 python3[203429]: ansible-ansible.builtin.systemd Invoked with name=nfs-server state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 26 19:30:14 compute-0 systemd[1]: Reloading.
Jan 26 19:30:14 compute-0 systemd-sysv-generator[203462]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 26 19:30:14 compute-0 systemd-rc-local-generator[203459]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 26 19:30:14 compute-0 systemd[1]: rpcbind.socket: Socket unit configuration has changed while unit has been running, no open socket file descriptor left. The socket unit is not functional until restarted.
Jan 26 19:30:14 compute-0 systemd[1]: Mounting NFSD configuration filesystem...
Jan 26 19:30:14 compute-0 systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 19:30:14 compute-0 systemd[1]: Starting NFSv4 ID-name mapping service...
Jan 26 19:30:14 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 26 19:30:14 compute-0 rpc.idmapd[203470]: Setting log level to 0
Jan 26 19:30:14 compute-0 systemd[1]: Started NFSv4 ID-name mapping service.
Jan 26 19:30:14 compute-0 systemd[1]: Mounted NFSD configuration filesystem.
Jan 26 19:30:14 compute-0 systemd[1]: Starting NFS Mount Daemon...
Jan 26 19:30:14 compute-0 systemd[1]: Starting NFSv4 Client Tracking Daemon...
Jan 26 19:30:14 compute-0 systemd[1]: Started NFSv4 Client Tracking Daemon.
Jan 26 19:30:14 compute-0 systemd[1]: Started NFS Mount Daemon.
Jan 26 19:30:14 compute-0 rpc.mountd[203478]: Version 2.5.4 starting
Jan 26 19:30:14 compute-0 systemd[1]: Starting NFS server and services...
Jan 26 19:30:14 compute-0 kernel: RPC: Registered rdma transport module.
Jan 26 19:30:14 compute-0 kernel: RPC: Registered rdma backchannel transport module.
Jan 26 19:30:14 compute-0 kernel: NFSD: Using nfsdcld client tracking operations.
Jan 26 19:30:14 compute-0 kernel: NFSD: no clients to reclaim, skipping NFSv4 grace period (net f0000000)
Jan 26 19:30:15 compute-0 systemd[1]: Reloading GSSAPI Proxy Daemon...
Jan 26 19:30:15 compute-0 systemd[1]: Reloaded GSSAPI Proxy Daemon.
Jan 26 19:30:15 compute-0 systemd[1]: Finished NFS server and services.
Jan 26 19:30:15 compute-0 sudo[203427]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:15 compute-0 sudo[203519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxymkwntsvhjqwiqmmrmbsmsqutlzscy ; /usr/bin/python3'
Jan 26 19:30:15 compute-0 sudo[203519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:15 compute-0 python3[203521]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_1 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:15 compute-0 sudo[203519]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:15 compute-0 sudo[203545]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-solkcwwqzxugunudtxpxvfgdjqfvbbmv ; /usr/bin/python3'
Jan 26 19:30:15 compute-0 sudo[203545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:15 compute-0 python3[203547]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinder_backend_2 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:15 compute-0 sudo[203545]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:15 compute-0 sudo[203571]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjkmjseptdxvjwmzzzywadbjotqktcyn ; /usr/bin/python3'
Jan 26 19:30:15 compute-0 sudo[203571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:16 compute-0 python3[203573]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/exports line=/data/cinderbackup 172.18.0.0/24(rw,sync,no_root_squash) state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 26 19:30:16 compute-0 sudo[203571]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:16 compute-0 sudo[203597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anczqmfomnyrabqdqesmvbnqylcumyyj ; /usr/bin/python3'
Jan 26 19:30:16 compute-0 sudo[203597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 19:30:16 compute-0 python3[203599]: ansible-ansible.legacy.command Invoked with _raw_params=exportfs -a _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 26 19:30:16 compute-0 sudo[203597]: pam_unix(sudo:session): session closed for user root
Jan 26 19:30:21 compute-0 sshd-session[203601]: Invalid user hms from 193.32.162.151 port 49848
Jan 26 19:30:22 compute-0 sshd-session[203601]: Connection closed by invalid user hms 193.32.162.151 port 49848 [preauth]
Jan 26 19:30:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:30:24.014 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:30:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:30:24.015 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:30:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:30:24.016 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:30:29 compute-0 podman[192499]: time="2026-01-26T19:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:30:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:30:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 26 19:30:30 compute-0 podman[203604]: 2026-01-26 19:30:30.433827304 +0000 UTC m=+0.140783686 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 19:30:31 compute-0 openstack_network_exporter[195363]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:30:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:30:31 compute-0 openstack_network_exporter[195363]: ERROR   19:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:30:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:30:34 compute-0 podman[203631]: 2026-01-26 19:30:34.338562424 +0000 UTC m=+0.078018412 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Jan 26 19:30:34 compute-0 podman[203630]: 2026-01-26 19:30:34.348620487 +0000 UTC m=+0.096213244 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7)
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.790 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.790 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.791 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.791 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.791 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.792 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.792 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.792 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:30:39 compute-0 nova_compute[183177]: 2026-01-26 19:30:39.793 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.314 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.513 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.515 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.552 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.553 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6025MB free_disk=73.1373176574707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.553 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:30:40 compute-0 nova_compute[183177]: 2026-01-26 19:30:40.553 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:30:41 compute-0 nova_compute[183177]: 2026-01-26 19:30:41.609 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:30:41 compute-0 nova_compute[183177]: 2026-01-26 19:30:41.610 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:30:40 up 54 min,  0 user,  load average: 0.73, 0.96, 0.74\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:30:41 compute-0 nova_compute[183177]: 2026-01-26 19:30:41.644 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:30:42 compute-0 nova_compute[183177]: 2026-01-26 19:30:42.153 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:30:42 compute-0 nova_compute[183177]: 2026-01-26 19:30:42.664 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:30:42 compute-0 nova_compute[183177]: 2026-01-26 19:30:42.664 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:30:43 compute-0 podman[203671]: 2026-01-26 19:30:43.01235014 +0000 UTC m=+0.109632730 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:30:59 compute-0 podman[192499]: time="2026-01-26T19:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:30:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:30:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 26 19:31:01 compute-0 podman[203705]: 2026-01-26 19:31:01.390217424 +0000 UTC m=+0.128869960 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 19:31:01 compute-0 openstack_network_exporter[195363]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:31:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:31:01 compute-0 openstack_network_exporter[195363]: ERROR   19:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:31:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:31:05 compute-0 podman[203732]: 2026-01-26 19:31:05.324066352 +0000 UTC m=+0.068685141 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 19:31:05 compute-0 podman[203731]: 2026-01-26 19:31:05.338635596 +0000 UTC m=+0.079893585 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Jan 26 19:31:13 compute-0 podman[203771]: 2026-01-26 19:31:13.301996921 +0000 UTC m=+0.060372097 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:31:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:31:24.018 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:31:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:31:24.018 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:31:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:31:24.018 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:31:29 compute-0 podman[192499]: time="2026-01-26T19:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:31:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:31:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 26 19:31:31 compute-0 openstack_network_exporter[195363]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:31:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:31:31 compute-0 openstack_network_exporter[195363]: ERROR   19:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:31:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:31:32 compute-0 podman[203796]: 2026-01-26 19:31:32.378589974 +0000 UTC m=+0.134018090 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest)
Jan 26 19:31:36 compute-0 podman[203823]: 2026-01-26 19:31:36.312255197 +0000 UTC m=+0.055871214 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:31:36 compute-0 podman[203822]: 2026-01-26 19:31:36.315673519 +0000 UTC m=+0.064720854 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.023 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.024 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.776 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.777 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.778 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.778 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.779 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.779 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.779 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:31:40 compute-0 nova_compute[183177]: 2026-01-26 19:31:40.780 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.292 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.293 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.293 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.293 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.607 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.608 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.627 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.627 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6128MB free_disk=73.1373176574707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.628 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:31:41 compute-0 nova_compute[183177]: 2026-01-26 19:31:41.628 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:31:42 compute-0 nova_compute[183177]: 2026-01-26 19:31:42.848 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:31:42 compute-0 nova_compute[183177]: 2026-01-26 19:31:42.848 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:31:41 up 56 min,  0 user,  load average: 0.27, 0.78, 0.69\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:31:42 compute-0 nova_compute[183177]: 2026-01-26 19:31:42.880 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:31:43 compute-0 nova_compute[183177]: 2026-01-26 19:31:43.407 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:31:44 compute-0 nova_compute[183177]: 2026-01-26 19:31:44.025 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:31:44 compute-0 nova_compute[183177]: 2026-01-26 19:31:44.026 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.397s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:31:44 compute-0 podman[203863]: 2026-01-26 19:31:44.326858818 +0000 UTC m=+0.074885419 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:32:03 compute-0 podman[203888]: 2026-01-26 19:32:03.38836596 +0000 UTC m=+0.136623800 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 19:32:07 compute-0 podman[203915]: 2026-01-26 19:32:07.345450578 +0000 UTC m=+0.085904160 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 19:32:07 compute-0 podman[203914]: 2026-01-26 19:32:07.362273625 +0000 UTC m=+0.109086960 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6)
Jan 26 19:32:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:12.981 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:32:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:12.982 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:32:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:12.986 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:32:15 compute-0 podman[203955]: 2026-01-26 19:32:15.330688936 +0000 UTC m=+0.074333975 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:32:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:18.311 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:21:84 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-8902ad13-aba9-426b-9627-0c1b5e832b90', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8902ad13-aba9-426b-9627-0c1b5e832b90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbc26b645f2a4d108c00608f11fdebb2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69100a54-cdd0-4f7c-b5ad-6f80bc8c4a47, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=27b184cd-e30d-4188-97b1-bbab17ad9515) old=Port_Binding(mac=['fa:16:3e:f3:21:84'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8902ad13-aba9-426b-9627-0c1b5e832b90', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8902ad13-aba9-426b-9627-0c1b5e832b90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bbc26b645f2a4d108c00608f11fdebb2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:32:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:18.312 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 27b184cd-e30d-4188-97b1-bbab17ad9515 in datapath 8902ad13-aba9-426b-9627-0c1b5e832b90 updated
Jan 26 19:32:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:18.315 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8902ad13-aba9-426b-9627-0c1b5e832b90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:32:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:18.317 104672 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpope3wdzh/privsep.sock']
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.188 104672 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.189 104672 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpope3wdzh/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:18.995 203984 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.003 203984 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.006 203984 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.007 203984 INFO oslo.privsep.daemon [-] privsep daemon running as pid 203984
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.191 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d2bad9-79bc-4985-8076-d43e6dc11a06]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.666 203984 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.666 203984 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:32:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:19.666 203984 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:32:20 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:20.117 203984 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 19:32:20 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:20.122 203984 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 19:32:20 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:20.163 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1b118547-35e6-4645-8a0d-977e17c0ab5e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:32:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:24.019 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:32:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:24.020 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:32:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:32:24.020 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:32:29 compute-0 podman[192499]: time="2026-01-26T19:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:32:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:32:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 26 19:32:30 compute-0 sshd-session[203991]: Invalid user docker from 193.32.162.151 port 55452
Jan 26 19:32:30 compute-0 sshd-session[203991]: Connection closed by invalid user docker 193.32.162.151 port 55452 [preauth]
Jan 26 19:32:31 compute-0 openstack_network_exporter[195363]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:32:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:32:31 compute-0 openstack_network_exporter[195363]: ERROR   19:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:32:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:32:33 compute-0 nova_compute[183177]: 2026-01-26 19:32:33.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:33 compute-0 nova_compute[183177]: 2026-01-26 19:32:33.155 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:32:33 compute-0 nova_compute[183177]: 2026-01-26 19:32:33.972 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:32:33 compute-0 nova_compute[183177]: 2026-01-26 19:32:33.974 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:33 compute-0 nova_compute[183177]: 2026-01-26 19:32:33.975 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:32:34 compute-0 podman[203994]: 2026-01-26 19:32:34.38650359 +0000 UTC m=+0.130189655 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 
10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:32:34 compute-0 nova_compute[183177]: 2026-01-26 19:32:34.484 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.004 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.902 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.902 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.903 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:32:37 compute-0 nova_compute[183177]: 2026-01-26 19:32:37.903 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.051 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.052 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.067 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.067 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6040MB free_disk=73.13733673095703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.068 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:32:38 compute-0 nova_compute[183177]: 2026-01-26 19:32:38.068 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:32:38 compute-0 podman[204022]: 2026-01-26 19:32:38.319976613 +0000 UTC m=+0.064979150 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260120)
Jan 26 19:32:38 compute-0 podman[204021]: 2026-01-26 19:32:38.321770063 +0000 UTC m=+0.071238461 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 26 19:32:39 compute-0 nova_compute[183177]: 2026-01-26 19:32:39.121 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:32:39 compute-0 nova_compute[183177]: 2026-01-26 19:32:39.122 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:32:38 up 56 min,  0 user,  load average: 0.16, 0.65, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:32:39 compute-0 nova_compute[183177]: 2026-01-26 19:32:39.145 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:32:40 compute-0 nova_compute[183177]: 2026-01-26 19:32:40.028 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:32:40 compute-0 nova_compute[183177]: 2026-01-26 19:32:40.540 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:32:40 compute-0 nova_compute[183177]: 2026-01-26 19:32:40.541 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.473s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.536 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.538 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.539 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.539 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.539 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:32:41 compute-0 nova_compute[183177]: 2026-01-26 19:32:41.540 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:32:46 compute-0 podman[204061]: 2026-01-26 19:32:46.306430535 +0000 UTC m=+0.057576637 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:32:59 compute-0 podman[192499]: time="2026-01-26T19:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:32:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:32:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 26 19:33:01 compute-0 openstack_network_exporter[195363]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:33:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:33:01 compute-0 openstack_network_exporter[195363]: ERROR   19:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:33:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:33:05 compute-0 podman[204086]: 2026-01-26 19:33:05.368805656 +0000 UTC m=+0.111489780 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:33:09 compute-0 podman[204112]: 2026-01-26 19:33:09.34856265 +0000 UTC m=+0.079989588 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 26 19:33:09 compute-0 podman[204111]: 2026-01-26 19:33:09.354022316 +0000 UTC m=+0.097134486 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 26 19:33:17 compute-0 podman[204149]: 2026-01-26 19:33:17.321288842 +0000 UTC m=+0.072486438 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:33:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:33:24.021 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:33:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:33:24.021 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:33:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:33:24.022 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:33:29 compute-0 podman[192499]: time="2026-01-26T19:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:33:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:33:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 19:33:31 compute-0 openstack_network_exporter[195363]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:33:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:33:31 compute-0 openstack_network_exporter[195363]: ERROR   19:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:33:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:33:35 compute-0 nova_compute[183177]: 2026-01-26 19:33:35.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:36 compute-0 podman[204174]: 2026-01-26 19:33:36.388006176 +0000 UTC m=+0.131620538 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.672 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.935 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.937 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.958 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.959 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6049MB free_disk=73.13733673095703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.959 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:33:37 compute-0 nova_compute[183177]: 2026-01-26 19:33:37.959 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.050 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.051 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:33:37 up 57 min,  0 user,  load average: 0.18, 0.56, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.108 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.147 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.147 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.166 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.190 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.211 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:33:39 compute-0 nova_compute[183177]: 2026-01-26 19:33:39.719 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:33:40 compute-0 nova_compute[183177]: 2026-01-26 19:33:40.234 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:33:40 compute-0 nova_compute[183177]: 2026-01-26 19:33:40.234 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.275s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:33:40 compute-0 podman[204203]: 2026-01-26 19:33:40.308828727 +0000 UTC m=+0.055975206 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:33:40 compute-0 podman[204202]: 2026-01-26 19:33:40.339592129 +0000 UTC m=+0.083988275 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.234 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.235 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.235 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.235 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.235 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:41 compute-0 nova_compute[183177]: 2026-01-26 19:33:41.235 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:33:42 compute-0 nova_compute[183177]: 2026-01-26 19:33:42.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:43 compute-0 nova_compute[183177]: 2026-01-26 19:33:43.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:33:48 compute-0 podman[204242]: 2026-01-26 19:33:48.309421503 +0000 UTC m=+0.059490981 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:33:59 compute-0 podman[192499]: time="2026-01-26T19:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:33:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:33:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 19:34:01 compute-0 openstack_network_exporter[195363]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:34:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:34:01 compute-0 openstack_network_exporter[195363]: ERROR   19:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:34:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:34:07 compute-0 podman[204265]: 2026-01-26 19:34:07.387937913 +0000 UTC m=+0.133525909 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:34:11 compute-0 podman[204292]: 2026-01-26 19:34:11.329640444 +0000 UTC m=+0.068834993 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:34:11 compute-0 podman[204291]: 2026-01-26 19:34:11.355396347 +0000 UTC m=+0.096310380 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 
'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 26 19:34:19 compute-0 podman[204331]: 2026-01-26 19:34:19.332241076 +0000 UTC m=+0.075049814 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:34:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:34:24.023 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:34:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:34:24.023 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:34:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:34:24.024 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:34:29 compute-0 podman[192499]: time="2026-01-26T19:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:34:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:34:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 26 19:34:31 compute-0 openstack_network_exporter[195363]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:34:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:34:31 compute-0 openstack_network_exporter[195363]: ERROR   19:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:34:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:34:37 compute-0 nova_compute[183177]: 2026-01-26 19:34:37.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:38 compute-0 podman[204356]: 2026-01-26 19:34:38.391106223 +0000 UTC m=+0.137357309 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.675 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.891 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.893 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.923 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.924 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6049MB free_disk=73.13760757446289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.924 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:34:38 compute-0 nova_compute[183177]: 2026-01-26 19:34:38.925 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:34:39 compute-0 nova_compute[183177]: 2026-01-26 19:34:39.979 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:34:39 compute-0 nova_compute[183177]: 2026-01-26 19:34:39.979 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:34:38 up 58 min,  0 user,  load average: 0.07, 0.46, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:34:40 compute-0 nova_compute[183177]: 2026-01-26 19:34:40.013 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:34:40 compute-0 nova_compute[183177]: 2026-01-26 19:34:40.522 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:34:41 compute-0 nova_compute[183177]: 2026-01-26 19:34:41.035 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:34:41 compute-0 nova_compute[183177]: 2026-01-26 19:34:41.036 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:34:42 compute-0 sshd-session[204383]: Invalid user admin from 193.32.162.151 port 32794
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.032 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.032 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.033 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.033 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.033 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:34:42 compute-0 podman[204385]: 2026-01-26 19:34:42.106133835 +0000 UTC m=+0.085784470 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 26 19:34:42 compute-0 podman[204386]: 2026-01-26 19:34:42.107210303 +0000 UTC m=+0.077540368 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 19:34:42 compute-0 sshd-session[204383]: Connection closed by invalid user admin 193.32.162.151 port 32794 [preauth]
Jan 26 19:34:42 compute-0 nova_compute[183177]: 2026-01-26 19:34:42.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:44 compute-0 nova_compute[183177]: 2026-01-26 19:34:44.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:34:50 compute-0 podman[204423]: 2026-01-26 19:34:50.348109673 +0000 UTC m=+0.088907790 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:34:59 compute-0 podman[192499]: time="2026-01-26T19:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:34:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:34:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 26 19:35:01 compute-0 openstack_network_exporter[195363]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:35:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:35:01 compute-0 openstack_network_exporter[195363]: ERROR   19:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:35:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:35:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:06.569 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:35:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:06.572 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:35:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:08.812 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:bc:9b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7a7c29d27ae4d23a91016528314c1cc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d27c55d-9c87-4908-9254-adcd14eb938f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4608fb84-b3a0-4b98-8d93-9834a5bb893e) old=Port_Binding(mac=['fa:16:3e:2e:bc:9b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7a7c29d27ae4d23a91016528314c1cc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:35:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:08.813 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4608fb84-b3a0-4b98-8d93-9834a5bb893e in datapath f8f391ed-cb1d-40c8-8c35-338c16ca4ecb updated
Jan 26 19:35:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:08.815 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:35:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:08.817 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a886e1-253e-492e-8347-c9066d3488a7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:35:09 compute-0 podman[204448]: 2026-01-26 19:35:09.398559951 +0000 UTC m=+0.135270531 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 26 19:35:12 compute-0 podman[204476]: 2026-01-26 19:35:12.320679254 +0000 UTC m=+0.068898585 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:35:12 compute-0 podman[204475]: 2026-01-26 19:35:12.351828552 +0000 UTC m=+0.096457706 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter)
Jan 26 19:35:15 compute-0 sshd-session[202797]: Received disconnect from 38.102.83.66 port 57558:11: disconnected by user
Jan 26 19:35:15 compute-0 sshd-session[202797]: Disconnected from user zuul 38.102.83.66 port 57558
Jan 26 19:35:15 compute-0 sshd-session[202794]: pam_unix(sshd:session): session closed for user zuul
Jan 26 19:35:15 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Jan 26 19:35:15 compute-0 systemd[1]: session-27.scope: Consumed 7.458s CPU time.
Jan 26 19:35:15 compute-0 systemd-logind[794]: Session 27 logged out. Waiting for processes to exit.
Jan 26 19:35:15 compute-0 systemd-logind[794]: Removed session 27.
Jan 26 19:35:16 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:16.575 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:35:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:19.315 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:64:40 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-71c42b51-e2bc-4bce-ab74-3bbadccc8d73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71c42b51-e2bc-4bce-ab74-3bbadccc8d73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d9c65b920754bec81e0adf7f8b9ee86', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c614a00-3cdb-4f57-9a6e-ec62869ff340, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c4923dc8-fad7-440b-abc1-a0d6c76ad4ce) old=Port_Binding(mac=['fa:16:3e:e9:64:40'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-71c42b51-e2bc-4bce-ab74-3bbadccc8d73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71c42b51-e2bc-4bce-ab74-3bbadccc8d73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d9c65b920754bec81e0adf7f8b9ee86', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:35:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:19.317 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c4923dc8-fad7-440b-abc1-a0d6c76ad4ce in datapath 71c42b51-e2bc-4bce-ab74-3bbadccc8d73 updated
Jan 26 19:35:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:19.318 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71c42b51-e2bc-4bce-ab74-3bbadccc8d73, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:35:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:19.319 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8e14164c-cca1-410d-98c9-0f60d29fa39a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:35:21 compute-0 podman[204513]: 2026-01-26 19:35:21.336468584 +0000 UTC m=+0.079700026 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:35:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:24.025 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:24.025 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:35:24.025 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:29 compute-0 podman[192499]: time="2026-01-26T19:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:35:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:35:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 26 19:35:31 compute-0 openstack_network_exporter[195363]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:35:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:35:31 compute-0 openstack_network_exporter[195363]: ERROR   19:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:35:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.671 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.895 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.897 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.917 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.919 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6056MB free_disk=73.13760375976562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.919 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:38 compute-0 nova_compute[183177]: 2026-01-26 19:35:38.920 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:40 compute-0 nova_compute[183177]: 2026-01-26 19:35:40.348 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:35:40 compute-0 nova_compute[183177]: 2026-01-26 19:35:40.348 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:35:38 up 59 min,  0 user,  load average: 0.02, 0.37, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:35:40 compute-0 nova_compute[183177]: 2026-01-26 19:35:40.380 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:35:40 compute-0 podman[204539]: 2026-01-26 19:35:40.424134407 +0000 UTC m=+0.163915172 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:35:40 compute-0 nova_compute[183177]: 2026-01-26 19:35:40.889 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:35:41 compute-0 nova_compute[183177]: 2026-01-26 19:35:41.400 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:35:41 compute-0 nova_compute[183177]: 2026-01-26 19:35:41.401 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.481s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:43 compute-0 podman[204567]: 2026-01-26 19:35:43.331519593 +0000 UTC m=+0.068548786 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 19:35:43 compute-0 podman[204566]: 2026-01-26 19:35:43.365105727 +0000 UTC m=+0.107055402 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible)
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.396 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.397 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.397 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.397 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.397 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:43 compute-0 nova_compute[183177]: 2026-01-26 19:35:43.398 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:35:44 compute-0 nova_compute[183177]: 2026-01-26 19:35:44.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:47 compute-0 nova_compute[183177]: 2026-01-26 19:35:47.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:35:51 compute-0 nova_compute[183177]: 2026-01-26 19:35:51.229 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:51 compute-0 nova_compute[183177]: 2026-01-26 19:35:51.230 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:51 compute-0 nova_compute[183177]: 2026-01-26 19:35:51.735 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:35:52 compute-0 podman[204604]: 2026-01-26 19:35:52.302017923 +0000 UTC m=+0.055552226 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:35:52 compute-0 nova_compute[183177]: 2026-01-26 19:35:52.338 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:52 compute-0 nova_compute[183177]: 2026-01-26 19:35:52.339 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:52 compute-0 nova_compute[183177]: 2026-01-26 19:35:52.344 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:35:52 compute-0 nova_compute[183177]: 2026-01-26 19:35:52.344 183181 INFO nova.compute.claims [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:35:53 compute-0 nova_compute[183177]: 2026-01-26 19:35:53.416 183181 DEBUG nova.compute.provider_tree [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:35:53 compute-0 nova_compute[183177]: 2026-01-26 19:35:53.926 183181 DEBUG nova.scheduler.client.report [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.437 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.438 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.954 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.955 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.957 183181 WARNING neutronclient.v2_0.client [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:35:54 compute-0 nova_compute[183177]: 2026-01-26 19:35:54.960 183181 WARNING neutronclient.v2_0.client [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:35:55 compute-0 nova_compute[183177]: 2026-01-26 19:35:55.476 183181 INFO nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:35:55 compute-0 nova_compute[183177]: 2026-01-26 19:35:55.989 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:35:56 compute-0 nova_compute[183177]: 2026-01-26 19:35:56.933 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Successfully created port: fe57ed2e-f52e-4f3e-a142-07f155a77aeb _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.012 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.014 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.015 183181 INFO nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Creating image(s)
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.015 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.016 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.017 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.017 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:35:57 compute-0 nova_compute[183177]: 2026-01-26 19:35:57.018 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.016 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Successfully updated port: fe57ed2e-f52e-4f3e-a142-07f155a77aeb _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.526 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.527 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquired lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.527 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.544 183181 DEBUG nova.compute.manager [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-changed-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.545 183181 DEBUG nova.compute.manager [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Refreshing instance network info cache due to event network-changed-fe57ed2e-f52e-4f3e-a142-07f155a77aeb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.545 183181 DEBUG oslo_concurrency.lockutils [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.613 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.619 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.619 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.709 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.part --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.711 183181 DEBUG nova.virt.images [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] 34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.712 183181 DEBUG nova.privsep.utils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.713 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.part /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.944 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.part /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.converted" returned: 0 in 0.231s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:35:58 compute-0 nova_compute[183177]: 2026-01-26 19:35:58.953 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.042 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434.converted --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.044 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.026s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.045 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.052 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.055 183181 INFO oslo.privsep.daemon [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpk8f1pec3/privsep.sock']
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.311 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:35:59 compute-0 podman[192499]: time="2026-01-26T19:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:35:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:35:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.806 183181 WARNING neutronclient.v2_0.client [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.916 183181 INFO oslo.privsep.daemon [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.709 204647 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.717 204647 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.720 204647 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 19:35:59 compute-0 nova_compute[183177]: 2026-01-26 19:35:59.721 204647 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204647
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.016 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.087 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.088 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.089 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.090 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.099 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.100 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.171 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.172 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.213 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.214 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.215 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.268 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.269 183181 DEBUG nova.virt.disk.api [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Checking if we can resize image /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.270 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.327 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.328 183181 DEBUG nova.virt.disk.api [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Cannot resize image /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.329 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.330 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Ensure instance console log exists: /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.330 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.331 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.331 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.451 183181 DEBUG nova.network.neutron [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Updating instance_info_cache with network_info: [{"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.968 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Releasing lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.969 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Instance network_info: |[{"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.970 183181 DEBUG oslo_concurrency.lockutils [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.971 183181 DEBUG nova.network.neutron [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Refreshing network info cache for port fe57ed2e-f52e-4f3e-a142-07f155a77aeb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.976 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Start _get_guest_xml network_info=[{"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.983 183181 WARNING nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.986 183181 DEBUG nova.virt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestContinuousAudit-server-148216201', uuid='1fb6e265-c050-4338-93f9-cbaebb25bca1'), owner=OwnerMeta(userid='bbd88006c20f40c1aa9a87832251e17f', username='tempest-TestContinuousAudit-2036926-project-admin', projectid='3d9c65b920754bec81e0adf7f8b9ee86', projectname='tempest-TestContinuousAudit-2036926'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456160.9862978) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.992 183181 DEBUG nova.virt.libvirt.host [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.993 183181 DEBUG nova.virt.libvirt.host [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.997 183181 DEBUG nova.virt.libvirt.host [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:36:00 compute-0 nova_compute[183177]: 2026-01-26 19:36:00.998 183181 DEBUG nova.virt.libvirt.host [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.000 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.001 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.002 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.002 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.002 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.003 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.003 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.004 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.004 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.005 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.005 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.006 183181 DEBUG nova.virt.hardware [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.012 183181 DEBUG nova.privsep.utils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.014 183181 DEBUG nova.virt.libvirt.vif [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:35:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-148216201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-148216201',id=1,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d9c65b920754bec81e0adf7f8b9ee86',ramdisk_id='',reservation_id='r-h4f8k09z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-2036926',owner_user_name='tempest-TestContinuousAudit-2036926-project-admin'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:35:56Z,user_data=None,user_id='bbd88006c20f40c1aa9a87832251e17f',uuid=1fb6e265-c050-4338-93f9-cbaebb25bca1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.014 183181 DEBUG nova.network.os_vif_util [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converting VIF {"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.016 183181 DEBUG nova.network.os_vif_util [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.019 183181 DEBUG nova.objects.instance [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fb6e265-c050-4338-93f9-cbaebb25bca1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:36:01 compute-0 openstack_network_exporter[195363]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:36:01 compute-0 openstack_network_exporter[195363]: ERROR   19:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.483 183181 WARNING neutronclient.v2_0.client [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.532 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <uuid>1fb6e265-c050-4338-93f9-cbaebb25bca1</uuid>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <name>instance-00000001</name>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:name>tempest-TestContinuousAudit-server-148216201</nova:name>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:36:00</nova:creationTime>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:36:01 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:36:01 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:user uuid="bbd88006c20f40c1aa9a87832251e17f">tempest-TestContinuousAudit-2036926-project-admin</nova:user>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:project uuid="3d9c65b920754bec81e0adf7f8b9ee86">tempest-TestContinuousAudit-2036926</nova:project>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         <nova:port uuid="fe57ed2e-f52e-4f3e-a142-07f155a77aeb">
Jan 26 19:36:01 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <system>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="serial">1fb6e265-c050-4338-93f9-cbaebb25bca1</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="uuid">1fb6e265-c050-4338-93f9-cbaebb25bca1</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </system>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <os>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </os>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <features>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </features>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.config"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:b1:17:e3"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <target dev="tapfe57ed2e-f5"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/console.log" append="off"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <video>
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </video>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:36:01 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:36:01 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:36:01 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:36:01 compute-0 nova_compute[183177]: </domain>
Jan 26 19:36:01 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.535 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Preparing to wait for external event network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.535 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.536 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.536 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.537 183181 DEBUG nova.virt.libvirt.vif [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:35:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-148216201',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-148216201',id=1,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3d9c65b920754bec81e0adf7f8b9ee86',ramdisk_id='',reservation_id='r-h4f8k09z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-2036926',owner_user_name='tempest-TestContinuousAudit-2036926-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:35:56Z,user_data=None,user_id='bbd88006c20f40c1aa9a87832251e17f',uuid=1fb6e265-c050-4338-93f9-cbaebb25bca1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.538 183181 DEBUG nova.network.os_vif_util [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converting VIF {"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.539 183181 DEBUG nova.network.os_vif_util [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.540 183181 DEBUG os_vif [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.602 183181 DEBUG ovsdbapp.backend.ovs_idl [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.603 183181 DEBUG ovsdbapp.backend.ovs_idl [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.603 183181 DEBUG ovsdbapp.backend.ovs_idl [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.604 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.604 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.605 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.605 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.607 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.610 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.618 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.618 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.619 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.620 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.620 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2d00322c-4bd1-525e-a82e-b04839ad5188', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.621 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.625 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:36:01 compute-0 nova_compute[183177]: 2026-01-26 19:36:01.627 183181 INFO oslo.privsep.daemon [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp9b672sic/privsep.sock']
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.367 183181 INFO oslo.privsep.daemon [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Spawned new privsep daemon via rootwrap
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.220 204668 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.227 204668 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.230 204668 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.230 204668 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204668
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.650 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.651 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe57ed2e-f5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.651 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapfe57ed2e-f5, col_values=(('qos', UUID('d16d9a33-3db3-41ee-9712-50a91113f29a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.653 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapfe57ed2e-f5, col_values=(('external_ids', {'iface-id': 'fe57ed2e-f52e-4f3e-a142-07f155a77aeb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:17:e3', 'vm-uuid': '1fb6e265-c050-4338-93f9-cbaebb25bca1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.708 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:02 compute-0 NetworkManager[55489]: <info>  [1769456162.7088] manager: (tapfe57ed2e-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.712 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.717 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:02 compute-0 nova_compute[183177]: 2026-01-26 19:36:02.719 183181 INFO os_vif [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5')
Jan 26 19:36:03 compute-0 nova_compute[183177]: 2026-01-26 19:36:03.267 183181 WARNING neutronclient.v2_0.client [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:36:03 compute-0 nova_compute[183177]: 2026-01-26 19:36:03.431 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:03 compute-0 nova_compute[183177]: 2026-01-26 19:36:03.464 183181 DEBUG nova.network.neutron [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Updated VIF entry in instance network info cache for port fe57ed2e-f52e-4f3e-a142-07f155a77aeb. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:36:03 compute-0 nova_compute[183177]: 2026-01-26 19:36:03.465 183181 DEBUG nova.network.neutron [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Updating instance_info_cache with network_info: [{"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:36:03 compute-0 nova_compute[183177]: 2026-01-26 19:36:03.973 183181 DEBUG oslo_concurrency.lockutils [req-25cf9bf4-8306-4530-85e9-1987e4bd1bc6 req-3a6c89d8-a3c6-45d5-a185-0137dd2d9b45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-1fb6e265-c050-4338-93f9-cbaebb25bca1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:36:04 compute-0 nova_compute[183177]: 2026-01-26 19:36:04.270 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:36:04 compute-0 nova_compute[183177]: 2026-01-26 19:36:04.271 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:36:04 compute-0 nova_compute[183177]: 2026-01-26 19:36:04.271 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] No VIF found with MAC fa:16:3e:b1:17:e3, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:36:04 compute-0 nova_compute[183177]: 2026-01-26 19:36:04.272 183181 INFO nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Using config drive
Jan 26 19:36:04 compute-0 nova_compute[183177]: 2026-01-26 19:36:04.787 183181 WARNING neutronclient.v2_0.client [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.452 183181 INFO nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Creating config drive at /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.config
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.462 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbhrur113 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.610 183181 DEBUG oslo_concurrency.processutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpbhrur113" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:05 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 26 19:36:05 compute-0 kernel: tapfe57ed2e-f5: entered promiscuous mode
Jan 26 19:36:05 compute-0 NetworkManager[55489]: <info>  [1769456165.7428] manager: (tapfe57ed2e-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 26 19:36:05 compute-0 ovn_controller[95396]: 2026-01-26T19:36:05Z|00040|binding|INFO|Claiming lport fe57ed2e-f52e-4f3e-a142-07f155a77aeb for this chassis.
Jan 26 19:36:05 compute-0 ovn_controller[95396]: 2026-01-26T19:36:05Z|00041|binding|INFO|fe57ed2e-f52e-4f3e-a142-07f155a77aeb: Claiming fa:16:3e:b1:17:e3 10.100.0.14
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.742 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.747 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.769 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:17:e3 10.100.0.14'], port_security=['fa:16:3e:b1:17:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1fb6e265-c050-4338-93f9-cbaebb25bca1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d9c65b920754bec81e0adf7f8b9ee86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e9e9e42e-44b2-436c-b112-28ca6fa4a5da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d27c55d-9c87-4908-9254-adcd14eb938f, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=fe57ed2e-f52e-4f3e-a142-07f155a77aeb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.770 104672 INFO neutron.agent.ovn.metadata.agent [-] Port fe57ed2e-f52e-4f3e-a142-07f155a77aeb in datapath f8f391ed-cb1d-40c8-8c35-338c16ca4ecb bound to our chassis
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.772 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8f391ed-cb1d-40c8-8c35-338c16ca4ecb
Jan 26 19:36:05 compute-0 systemd-udevd[204694]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:36:05 compute-0 NetworkManager[55489]: <info>  [1769456165.7994] device (tapfe57ed2e-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:36:05 compute-0 NetworkManager[55489]: <info>  [1769456165.8001] device (tapfe57ed2e-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:36:05 compute-0 systemd-machined[154465]: New machine qemu-1-instance-00000001.
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.824 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7f64bf-eb1a-4814-83a9-f7bf9214f88d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.825 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8f391ed-c1 in ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.830 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8f391ed-c0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.830 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2483f06f-313e-4537-a567-17f97a2cc3f0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.831 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f05afd50-067b-427a-966a-f4c718fdd404]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.863 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.865 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[332b413e-5031-4f02-81eb-ba3af16d2544]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:05 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 26 19:36:05 compute-0 ovn_controller[95396]: 2026-01-26T19:36:05Z|00042|binding|INFO|Setting lport fe57ed2e-f52e-4f3e-a142-07f155a77aeb ovn-installed in OVS
Jan 26 19:36:05 compute-0 ovn_controller[95396]: 2026-01-26T19:36:05Z|00043|binding|INFO|Setting lport fe57ed2e-f52e-4f3e-a142-07f155a77aeb up in Southbound
Jan 26 19:36:05 compute-0 nova_compute[183177]: 2026-01-26 19:36:05.874 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.890 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad3efcb-6bfa-4d55-886f-f8d4c343de26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:05.893 104672 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp7rbx8fyt/privsep.sock']
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.581 183181 DEBUG nova.compute.manager [req-ce08ec1a-4245-406b-8583-4af27e4ed607 req-26447016-03de-4a03-82c2-147ca721c9f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.583 183181 DEBUG oslo_concurrency.lockutils [req-ce08ec1a-4245-406b-8583-4af27e4ed607 req-26447016-03de-4a03-82c2-147ca721c9f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.584 183181 DEBUG oslo_concurrency.lockutils [req-ce08ec1a-4245-406b-8583-4af27e4ed607 req-26447016-03de-4a03-82c2-147ca721c9f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.584 183181 DEBUG oslo_concurrency.lockutils [req-ce08ec1a-4245-406b-8583-4af27e4ed607 req-26447016-03de-4a03-82c2-147ca721c9f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.585 183181 DEBUG nova.compute.manager [req-ce08ec1a-4245-406b-8583-4af27e4ed607 req-26447016-03de-4a03-82c2-147ca721c9f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Processing event network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.636 104672 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.637 104672 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp7rbx8fyt/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.465 204720 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.472 204720 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.475 204720 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.476 204720 INFO oslo.privsep.daemon [-] privsep daemon running as pid 204720
Jan 26 19:36:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:06.639 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4dd277-7690-485e-9453-238723cb29e3]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.895 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.902 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.920 183181 INFO nova.virt.libvirt.driver [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Instance spawned successfully.
Jan 26 19:36:06 compute-0 nova_compute[183177]: 2026-01-26 19:36:06.921 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.134 204720 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.134 204720 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.135 204720 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.440 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.441 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.442 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.443 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.444 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.444 183181 DEBUG nova.virt.libvirt.driver [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.586 204720 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.591 204720 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.682 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[59704c5f-c366-45ba-a409-1fe442ec4d05]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.706 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a3aa0e44-86a7-4dbd-912c-d0ec05b9a879]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 systemd-udevd[204693]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.708 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:07 compute-0 NetworkManager[55489]: <info>  [1769456167.7093] manager: (tapf8f391ed-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.755 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6d260a-532f-458e-b4c7-92b5399d3985]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.759 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[225ffc8f-3959-4f6f-9719-a35fdbbbd7fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 NetworkManager[55489]: <info>  [1769456167.7946] device (tapf8f391ed-c0): carrier: link connected
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.804 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc3417d-d2e1-4168-8253-da8c26837d41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.834 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c74dd72e-9e09-436a-9eb6-b7c5cd43a96d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8f391ed-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:bc:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362668, 'reachable_time': 19586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 204750, 'error': None, 'target': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.858 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca698ae-f24b-4d1b-b2bc-c3c79b7572d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:bc9b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362668, 'tstamp': 362668}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 204751, 'error': None, 'target': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.877 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2c298f02-6f60-471d-924b-1b206926357e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8f391ed-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:bc:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362668, 'reachable_time': 19586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 204752, 'error': None, 'target': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.925 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d5d932-ed83-4b08-9749-532a748e298e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.962 183181 INFO nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Took 10.95 seconds to spawn the instance on the hypervisor.
Jan 26 19:36:07 compute-0 nova_compute[183177]: 2026-01-26 19:36:07.964 183181 DEBUG nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.996 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a11aa3-a119-4bd4-aafe-d87aba87372f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.997 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8f391ed-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.997 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:36:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:07.997 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8f391ed-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:08 compute-0 NetworkManager[55489]: <info>  [1769456168.0000] manager: (tapf8f391ed-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.000 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:08 compute-0 kernel: tapf8f391ed-c0: entered promiscuous mode
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.004 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8f391ed-c0, col_values=(('external_ids', {'iface-id': '4608fb84-b3a0-4b98-8d93-9834a5bb893e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:08 compute-0 ovn_controller[95396]: 2026-01-26T19:36:08Z|00044|binding|INFO|Releasing lport 4608fb84-b3a0-4b98-8d93-9834a5bb893e from this chassis (sb_readonly=0)
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.008 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2685a577-8c70-43cc-ba68-541cded890e9]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.009 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.009 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.008 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.009 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f8f391ed-cb1d-40c8-8c35-338c16ca4ecb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.009 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.010 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6600a041-3c90-4559-92e6-4ca1e0f8fbd1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.010 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.010 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5dd441-4101-4da0-a15f-9bce122789ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.011 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID f8f391ed-cb1d-40c8-8c35-338c16ca4ecb
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:36:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:08.011 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'env', 'PROCESS_TAG=haproxy-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.023 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.495 183181 INFO nova.compute.manager [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Took 16.25 seconds to build instance.
Jan 26 19:36:08 compute-0 podman[204785]: 2026-01-26 19:36:08.520302755 +0000 UTC m=+0.075563935 container create 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Jan 26 19:36:08 compute-0 podman[204785]: 2026-01-26 19:36:08.474917404 +0000 UTC m=+0.030178564 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:36:08 compute-0 systemd[1]: Started libpod-conmon-8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06.scope.
Jan 26 19:36:08 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:36:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b63a3f1473895a7b3da3dfa0dcbc2f0a7f963cf0c776ad8ae0e1ac4ce3ec6d5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.649 183181 DEBUG nova.compute.manager [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.650 183181 DEBUG oslo_concurrency.lockutils [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.651 183181 DEBUG oslo_concurrency.lockutils [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.651 183181 DEBUG oslo_concurrency.lockutils [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.651 183181 DEBUG nova.compute.manager [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] No waiting events found dispatching network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:36:08 compute-0 nova_compute[183177]: 2026-01-26 19:36:08.652 183181 WARNING nova.compute.manager [req-34dc743d-07a6-4862-a46c-3aeff3f9f884 req-f1dde468-1b22-4772-9a59-261e427a6159 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received unexpected event network-vif-plugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb for instance with vm_state active and task_state None.
Jan 26 19:36:08 compute-0 podman[204785]: 2026-01-26 19:36:08.665025609 +0000 UTC m=+0.220286839 container init 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 19:36:08 compute-0 podman[204785]: 2026-01-26 19:36:08.675521292 +0000 UTC m=+0.230782462 container start 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 19:36:08 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [NOTICE]   (204804) : New worker (204806) forked
Jan 26 19:36:08 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [NOTICE]   (204804) : Loading success.
Jan 26 19:36:09 compute-0 nova_compute[183177]: 2026-01-26 19:36:09.001 183181 DEBUG oslo_concurrency.lockutils [None req-5553064b-4d38-492a-9973-2c4e205c2c75 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.771s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:11 compute-0 podman[204815]: 2026-01-26 19:36:11.396281385 +0000 UTC m=+0.137215344 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 19:36:12 compute-0 nova_compute[183177]: 2026-01-26 19:36:12.712 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:13 compute-0 nova_compute[183177]: 2026-01-26 19:36:13.438 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:14 compute-0 podman[204841]: 2026-01-26 19:36:14.336755761 +0000 UTC m=+0.077445945 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible)
Jan 26 19:36:14 compute-0 podman[204842]: 2026-01-26 19:36:14.340291326 +0000 UTC m=+0.073697004 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120)
Jan 26 19:36:17 compute-0 nova_compute[183177]: 2026-01-26 19:36:17.717 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:18 compute-0 nova_compute[183177]: 2026-01-26 19:36:18.441 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:20 compute-0 ovn_controller[95396]: 2026-01-26T19:36:20Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:17:e3 10.100.0.14
Jan 26 19:36:20 compute-0 ovn_controller[95396]: 2026-01-26T19:36:20Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:17:e3 10.100.0.14
Jan 26 19:36:22 compute-0 nova_compute[183177]: 2026-01-26 19:36:22.721 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:23 compute-0 podman[204902]: 2026-01-26 19:36:23.337087395 +0000 UTC m=+0.076973122 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:36:23 compute-0 nova_compute[183177]: 2026-01-26 19:36:23.445 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:24.027 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:24.027 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:24.028 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.087 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.088 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.089 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.089 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.090 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.106 183181 INFO nova.compute.manager [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Terminating instance
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.626 183181 DEBUG nova.compute.manager [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:36:25 compute-0 kernel: tapfe57ed2e-f5 (unregistering): left promiscuous mode
Jan 26 19:36:25 compute-0 NetworkManager[55489]: <info>  [1769456185.6554] device (tapfe57ed2e-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:36:25 compute-0 ovn_controller[95396]: 2026-01-26T19:36:25Z|00045|binding|INFO|Releasing lport fe57ed2e-f52e-4f3e-a142-07f155a77aeb from this chassis (sb_readonly=0)
Jan 26 19:36:25 compute-0 ovn_controller[95396]: 2026-01-26T19:36:25Z|00046|binding|INFO|Setting lport fe57ed2e-f52e-4f3e-a142-07f155a77aeb down in Southbound
Jan 26 19:36:25 compute-0 ovn_controller[95396]: 2026-01-26T19:36:25Z|00047|binding|INFO|Removing iface tapfe57ed2e-f5 ovn-installed in OVS
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.670 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.673 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.682 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:17:e3 10.100.0.14'], port_security=['fa:16:3e:b1:17:e3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1fb6e265-c050-4338-93f9-cbaebb25bca1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d9c65b920754bec81e0adf7f8b9ee86', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e9e9e42e-44b2-436c-b112-28ca6fa4a5da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d27c55d-9c87-4908-9254-adcd14eb938f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=fe57ed2e-f52e-4f3e-a142-07f155a77aeb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.684 104672 INFO neutron.agent.ovn.metadata.agent [-] Port fe57ed2e-f52e-4f3e-a142-07f155a77aeb in datapath f8f391ed-cb1d-40c8-8c35-338c16ca4ecb unbound from our chassis
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.685 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.687 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[506013a3-9b99-411e-a578-474a39b55d13]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.687 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb namespace which is not needed anymore
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.690 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:25 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 26 19:36:25 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.876s CPU time.
Jan 26 19:36:25 compute-0 systemd-machined[154465]: Machine qemu-1-instance-00000001 terminated.
Jan 26 19:36:25 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [NOTICE]   (204804) : haproxy version is 3.0.5-8e879a5
Jan 26 19:36:25 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [NOTICE]   (204804) : path to executable is /usr/sbin/haproxy
Jan 26 19:36:25 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [WARNING]  (204804) : Exiting Master process...
Jan 26 19:36:25 compute-0 podman[204953]: 2026-01-26 19:36:25.845964807 +0000 UTC m=+0.047768806 container kill 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:36:25 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [ALERT]    (204804) : Current worker (204806) exited with code 143 (Terminated)
Jan 26 19:36:25 compute-0 neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb[204800]: [WARNING]  (204804) : All workers exited. Exiting... (0)
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.848 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:25 compute-0 systemd[1]: libpod-8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06.scope: Deactivated successfully.
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.853 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.887 183181 INFO nova.virt.libvirt.driver [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Instance destroyed successfully.
Jan 26 19:36:25 compute-0 nova_compute[183177]: 2026-01-26 19:36:25.888 183181 DEBUG nova.objects.instance [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lazy-loading 'resources' on Instance uuid 1fb6e265-c050-4338-93f9-cbaebb25bca1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:36:25 compute-0 podman[204972]: 2026-01-26 19:36:25.92371625 +0000 UTC m=+0.051138688 container died 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 19:36:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06-userdata-shm.mount: Deactivated successfully.
Jan 26 19:36:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-b63a3f1473895a7b3da3dfa0dcbc2f0a7f963cf0c776ad8ae0e1ac4ce3ec6d5c-merged.mount: Deactivated successfully.
Jan 26 19:36:25 compute-0 podman[204972]: 2026-01-26 19:36:25.960589652 +0000 UTC m=+0.088012050 container cleanup 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Jan 26 19:36:25 compute-0 systemd[1]: libpod-conmon-8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06.scope: Deactivated successfully.
Jan 26 19:36:25 compute-0 podman[204980]: 2026-01-26 19:36:25.988707058 +0000 UTC m=+0.096824636 container remove 8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.997 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a35bfb6e-fb0c-4677-b37c-b83c9501a2a9]: (4, ("Mon Jan 26 07:36:25 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb (8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06)\n8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06\nMon Jan 26 07:36:25 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb (8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06)\n8ee38bb917d5e9379398a00e3a4a542f1865f754486d9e764c6ab9a630aa3a06\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.998 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[aae0df4e-be42-4205-bac8-16d694b31956]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.999 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8f391ed-cb1d-40c8-8c35-338c16ca4ecb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:25.999 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e80a6732-19d1-4805-8528-0295ca450e80]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.000 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8f391ed-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.003 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 kernel: tapf8f391ed-c0: left promiscuous mode
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.039 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2becbbc6-fbe3-4a59-8c5a-da487d81d6d0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.057 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[de64b8d8-29c1-4ef9-818f-91727047acfa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.058 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3b00ae99-b1c8-42b6-8792-cfd5cd6174c0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.083 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7b12998d-c967-4c56-8ca0-6f2226cccb15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362656, 'reachable_time': 23136, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205018, 'error': None, 'target': 'ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 systemd[1]: run-netns-ovnmeta\x2df8f391ed\x2dcb1d\x2d40c8\x2d8c35\x2d338c16ca4ecb.mount: Deactivated successfully.
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.091 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8f391ed-cb1d-40c8-8c35-338c16ca4ecb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.093 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[c4fb4fae-619b-4ba8-a1b2-7de627d0eecc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.396 183181 DEBUG nova.virt.libvirt.vif [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:35:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-148216201',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-148216201',id=1,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:36:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d9c65b920754bec81e0adf7f8b9ee86',ramdisk_id='',reservation_id='r-h4f8k09z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestContinuousAudit-2036926',owner_user_name='tempest-TestContinuousAudit-2036926-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:36:07Z,user_data=None,user_id='bbd88006c20f40c1aa9a87832251e17f',uuid=1fb6e265-c050-4338-93f9-cbaebb25bca1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.397 183181 DEBUG nova.network.os_vif_util [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converting VIF {"id": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "address": "fa:16:3e:b1:17:e3", "network": {"id": "f8f391ed-cb1d-40c8-8c35-338c16ca4ecb", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1440022523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7a7c29d27ae4d23a91016528314c1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe57ed2e-f5", "ovs_interfaceid": "fe57ed2e-f52e-4f3e-a142-07f155a77aeb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.397 183181 DEBUG nova.network.os_vif_util [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.397 183181 DEBUG os_vif [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.399 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.399 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe57ed2e-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.401 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.402 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.403 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.403 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d16d9a33-3db3-41ee-9712-50a91113f29a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.403 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.405 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.407 183181 INFO os_vif [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:17:e3,bridge_name='br-int',has_traffic_filtering=True,id=fe57ed2e-f52e-4f3e-a142-07f155a77aeb,network=Network(f8f391ed-cb1d-40c8-8c35-338c16ca4ecb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe57ed2e-f5')
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.408 183181 INFO nova.virt.libvirt.driver [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Deleting instance files /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1_del
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.409 183181 INFO nova.virt.libvirt.driver [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Deletion of /var/lib/nova/instances/1fb6e265-c050-4338-93f9-cbaebb25bca1_del complete
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.455 183181 DEBUG nova.compute.manager [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.455 183181 DEBUG oslo_concurrency.lockutils [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.455 183181 DEBUG oslo_concurrency.lockutils [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.456 183181 DEBUG oslo_concurrency.lockutils [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.456 183181 DEBUG nova.compute.manager [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] No waiting events found dispatching network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.456 183181 DEBUG nova.compute.manager [req-4d8774e2-3f55-4e28-8dda-56a5e4682fa2 req-08c3d94f-5db9-4840-a1df-035257aafe9f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.521 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.521 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:26.522 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.928 183181 INFO nova.compute.manager [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Took 1.30 seconds to destroy the instance on the hypervisor.
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.929 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.929 183181 DEBUG nova.compute.manager [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.929 183181 DEBUG nova.network.neutron [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:36:26 compute-0 nova_compute[183177]: 2026-01-26 19:36:26.929 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:36:27 compute-0 nova_compute[183177]: 2026-01-26 19:36:27.306 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.316 183181 DEBUG nova.network.neutron [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.481 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.540 183181 DEBUG nova.compute.manager [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.541 183181 DEBUG oslo_concurrency.lockutils [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.542 183181 DEBUG oslo_concurrency.lockutils [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.542 183181 DEBUG oslo_concurrency.lockutils [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.542 183181 DEBUG nova.compute.manager [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] No waiting events found dispatching network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.543 183181 DEBUG nova.compute.manager [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-unplugged-fe57ed2e-f52e-4f3e-a142-07f155a77aeb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.543 183181 DEBUG nova.compute.manager [req-2365f2af-4b14-4ccc-b395-c65bb88f83d4 req-d520acd3-a157-4ae4-a876-27fdc3999866 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Received event network-vif-deleted-fe57ed2e-f52e-4f3e-a142-07f155a77aeb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:36:28 compute-0 nova_compute[183177]: 2026-01-26 19:36:28.823 183181 INFO nova.compute.manager [-] [instance: 1fb6e265-c050-4338-93f9-cbaebb25bca1] Took 1.89 seconds to deallocate network for instance.
Jan 26 19:36:29 compute-0 nova_compute[183177]: 2026-01-26 19:36:29.355 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:29 compute-0 nova_compute[183177]: 2026-01-26 19:36:29.356 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:29 compute-0 nova_compute[183177]: 2026-01-26 19:36:29.717 183181 DEBUG nova.compute.provider_tree [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:36:29 compute-0 podman[192499]: time="2026-01-26T19:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:36:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:36:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.248 183181 ERROR nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] [req-7280c2bf-414c-4c11-bc9f-970cbca5348b] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID a47e311f-639f-4d60-b79d-85bbf53e2f35.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-7280c2bf-414c-4c11-bc9f-970cbca5348b"}]}
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.267 183181 DEBUG nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.283 183181 DEBUG nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.283 183181 DEBUG nova.compute.provider_tree [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.294 183181 DEBUG nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.313 183181 DEBUG nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO
 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.356 183181 DEBUG nova.compute.provider_tree [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.893 183181 DEBUG nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updated inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.894 183181 DEBUG nova.compute.provider_tree [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:36:30 compute-0 nova_compute[183177]: 2026-01-26 19:36:30.894 183181 DEBUG nova.compute.provider_tree [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:36:31 compute-0 nova_compute[183177]: 2026-01-26 19:36:31.404 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.048s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:31 compute-0 nova_compute[183177]: 2026-01-26 19:36:31.406 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:31 compute-0 openstack_network_exporter[195363]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:36:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:36:31 compute-0 openstack_network_exporter[195363]: ERROR   19:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:36:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:36:31 compute-0 nova_compute[183177]: 2026-01-26 19:36:31.793 183181 INFO nova.scheduler.client.report [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Deleted allocations for instance 1fb6e265-c050-4338-93f9-cbaebb25bca1
Jan 26 19:36:32 compute-0 nova_compute[183177]: 2026-01-26 19:36:32.827 183181 DEBUG oslo_concurrency.lockutils [None req-8834f11c-149e-42da-ba89-7c0364b2bb89 bbd88006c20f40c1aa9a87832251e17f 3d9c65b920754bec81e0adf7f8b9ee86 - - default default] Lock "1fb6e265-c050-4338-93f9-cbaebb25bca1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.739s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:33 compute-0 nova_compute[183177]: 2026-01-26 19:36:33.515 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:33.524 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:36:36 compute-0 nova_compute[183177]: 2026-01-26 19:36:36.407 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.518 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.664 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.666 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.849 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.850 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.873 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.874 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5814MB free_disk=73.10329818725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.874 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:36:38 compute-0 nova_compute[183177]: 2026-01-26 19:36:38.875 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:36:39 compute-0 nova_compute[183177]: 2026-01-26 19:36:39.925 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:36:39 compute-0 nova_compute[183177]: 2026-01-26 19:36:39.925 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:36:38 up  1:00,  0 user,  load average: 0.26, 0.38, 0.53\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:36:39 compute-0 nova_compute[183177]: 2026-01-26 19:36:39.951 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:36:40 compute-0 nova_compute[183177]: 2026-01-26 19:36:40.457 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:36:40 compute-0 nova_compute[183177]: 2026-01-26 19:36:40.572 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:40 compute-0 nova_compute[183177]: 2026-01-26 19:36:40.968 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:36:40 compute-0 nova_compute[183177]: 2026-01-26 19:36:40.969 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:36:41 compute-0 nova_compute[183177]: 2026-01-26 19:36:41.409 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:42 compute-0 podman[205025]: 2026-01-26 19:36:42.355733953 +0000 UTC m=+0.105497310 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:36:42 compute-0 nova_compute[183177]: 2026-01-26 19:36:42.970 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:42 compute-0 nova_compute[183177]: 2026-01-26 19:36:42.971 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:42 compute-0 nova_compute[183177]: 2026-01-26 19:36:42.971 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:42 compute-0 nova_compute[183177]: 2026-01-26 19:36:42.971 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:36:43 compute-0 nova_compute[183177]: 2026-01-26 19:36:43.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:43 compute-0 nova_compute[183177]: 2026-01-26 19:36:43.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:43 compute-0 nova_compute[183177]: 2026-01-26 19:36:43.560 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:45 compute-0 nova_compute[183177]: 2026-01-26 19:36:45.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:45 compute-0 nova_compute[183177]: 2026-01-26 19:36:45.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:36:45 compute-0 podman[205053]: 2026-01-26 19:36:45.33523623 +0000 UTC m=+0.081945017 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:36:45 compute-0 podman[205052]: 2026-01-26 19:36:45.345391973 +0000 UTC m=+0.090277011 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter)
Jan 26 19:36:46 compute-0 nova_compute[183177]: 2026-01-26 19:36:46.412 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:48 compute-0 nova_compute[183177]: 2026-01-26 19:36:48.563 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:50 compute-0 sshd-session[205093]: Invalid user admin from 193.32.162.151 port 38362
Jan 26 19:36:50 compute-0 sshd-session[205093]: Connection closed by invalid user admin 193.32.162.151 port 38362 [preauth]
Jan 26 19:36:51 compute-0 nova_compute[183177]: 2026-01-26 19:36:51.414 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:53.344 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:a8:a0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a4730ad48c4354a0553a82065b2a70', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d064364a-4825-4a41-ba3d-524a7bff4047, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=56efd16c-bb3e-4b64-8c74-e0372eb8bd16) old=Port_Binding(mac=['fa:16:3e:f7:a8:a0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25a4730ad48c4354a0553a82065b2a70', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:36:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:53.345 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 56efd16c-bb3e-4b64-8c74-e0372eb8bd16 in datapath 414e7f05-0834-4508-b4cf-6d4a0b570b9e updated
Jan 26 19:36:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:53.347 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 414e7f05-0834-4508-b4cf-6d4a0b570b9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:36:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:36:53.348 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[130ac67d-e738-465a-924a-d1292849e0bc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:36:53 compute-0 nova_compute[183177]: 2026-01-26 19:36:53.598 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:54 compute-0 podman[205095]: 2026-01-26 19:36:54.30580966 +0000 UTC m=+0.058107885 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:36:56 compute-0 nova_compute[183177]: 2026-01-26 19:36:56.417 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:58 compute-0 nova_compute[183177]: 2026-01-26 19:36:58.601 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:36:59 compute-0 podman[192499]: time="2026-01-26T19:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:36:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:36:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 26 19:37:01 compute-0 nova_compute[183177]: 2026-01-26 19:37:01.419 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:01 compute-0 openstack_network_exporter[195363]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:37:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:37:01 compute-0 openstack_network_exporter[195363]: ERROR   19:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:37:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:37:03 compute-0 nova_compute[183177]: 2026-01-26 19:37:03.638 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:05.351 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:b3:f4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-576b43fa-68fe-4e8d-a5ea-b2a26501750d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576b43fa-68fe-4e8d-a5ea-b2a26501750d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f835ebdba84c44c2a95961eb13570992', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93d7dd1f-3f0c-4a1a-8652-6e2dc86bba50, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bc5f1367-2ede-48e4-add1-2de61b20998c) old=Port_Binding(mac=['fa:16:3e:4d:b3:f4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-576b43fa-68fe-4e8d-a5ea-b2a26501750d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576b43fa-68fe-4e8d-a5ea-b2a26501750d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f835ebdba84c44c2a95961eb13570992', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:37:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:05.352 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bc5f1367-2ede-48e4-add1-2de61b20998c in datapath 576b43fa-68fe-4e8d-a5ea-b2a26501750d updated
Jan 26 19:37:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:05.354 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576b43fa-68fe-4e8d-a5ea-b2a26501750d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:37:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:05.355 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bc353706-e280-4cfd-a3fa-5159c963c0a3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:06 compute-0 nova_compute[183177]: 2026-01-26 19:37:06.422 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:08 compute-0 nova_compute[183177]: 2026-01-26 19:37:08.639 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:11 compute-0 nova_compute[183177]: 2026-01-26 19:37:11.424 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:13 compute-0 podman[205119]: 2026-01-26 19:37:13.395517286 +0000 UTC m=+0.150138020 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 26 19:37:13 compute-0 nova_compute[183177]: 2026-01-26 19:37:13.640 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:14 compute-0 ovn_controller[95396]: 2026-01-26T19:37:14Z|00048|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 19:37:16 compute-0 podman[205146]: 2026-01-26 19:37:16.349028242 +0000 UTC m=+0.088822002 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7)
Jan 26 19:37:16 compute-0 podman[205147]: 2026-01-26 19:37:16.37437675 +0000 UTC m=+0.108956244 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 19:37:16 compute-0 nova_compute[183177]: 2026-01-26 19:37:16.426 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:18 compute-0 nova_compute[183177]: 2026-01-26 19:37:18.059 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:18 compute-0 nova_compute[183177]: 2026-01-26 19:37:18.059 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:18 compute-0 nova_compute[183177]: 2026-01-26 19:37:18.563 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:37:18 compute-0 nova_compute[183177]: 2026-01-26 19:37:18.643 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:19 compute-0 nova_compute[183177]: 2026-01-26 19:37:19.127 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:19 compute-0 nova_compute[183177]: 2026-01-26 19:37:19.128 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:19 compute-0 nova_compute[183177]: 2026-01-26 19:37:19.137 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:37:19 compute-0 nova_compute[183177]: 2026-01-26 19:37:19.137 183181 INFO nova.compute.claims [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:37:20 compute-0 nova_compute[183177]: 2026-01-26 19:37:20.195 183181 DEBUG nova.compute.provider_tree [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:37:20 compute-0 nova_compute[183177]: 2026-01-26 19:37:20.714 183181 DEBUG nova.scheduler.client.report [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.226 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.228 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.428 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.742 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.742 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.743 183181 WARNING neutronclient.v2_0.client [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:37:21 compute-0 nova_compute[183177]: 2026-01-26 19:37:21.744 183181 WARNING neutronclient.v2_0.client [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:37:22 compute-0 nova_compute[183177]: 2026-01-26 19:37:22.252 183181 INFO nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:37:22 compute-0 nova_compute[183177]: 2026-01-26 19:37:22.597 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Successfully created port: ab0e96d7-c389-4db4-8ce6-f189706fb705 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:37:22 compute-0 nova_compute[183177]: 2026-01-26 19:37:22.762 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.645 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.652 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Successfully updated port: ab0e96d7-c389-4db4-8ce6-f189706fb705 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.707 183181 DEBUG nova.compute.manager [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-changed-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.708 183181 DEBUG nova.compute.manager [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Refreshing instance network info cache due to event network-changed-ab0e96d7-c389-4db4-8ce6-f189706fb705. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.709 183181 DEBUG oslo_concurrency.lockutils [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.709 183181 DEBUG oslo_concurrency.lockutils [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.709 183181 DEBUG nova.network.neutron [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Refreshing network info cache for port ab0e96d7-c389-4db4-8ce6-f189706fb705 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.791 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.793 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.794 183181 INFO nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Creating image(s)
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.795 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.796 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.797 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.798 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.804 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.807 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.880 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.881 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.882 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.883 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.890 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.892 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.968 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:23 compute-0 nova_compute[183177]: 2026-01-26 19:37:23.969 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.001 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.002 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.002 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:24.029 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:24.030 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:24.030 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.060 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.061 183181 DEBUG nova.virt.disk.api [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Checking if we can resize image /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.062 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.119 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.120 183181 DEBUG nova.virt.disk.api [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Cannot resize image /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.121 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.122 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Ensure instance console log exists: /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.122 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.123 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.123 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.159 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.218 183181 WARNING neutronclient.v2_0.client [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.320 183181 DEBUG nova.network.neutron [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:37:24 compute-0 nova_compute[183177]: 2026-01-26 19:37:24.696 183181 DEBUG nova.network.neutron [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:37:25 compute-0 nova_compute[183177]: 2026-01-26 19:37:25.204 183181 DEBUG oslo_concurrency.lockutils [req-c3a87871-e712-4d5b-8319-9ce2180f6c99 req-076f1af1-7f66-4d15-a29a-85aaa014c1d0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:37:25 compute-0 nova_compute[183177]: 2026-01-26 19:37:25.206 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquired lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:37:25 compute-0 nova_compute[183177]: 2026-01-26 19:37:25.206 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:37:25 compute-0 podman[205202]: 2026-01-26 19:37:25.34552898 +0000 UTC m=+0.084025796 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:37:26 compute-0 nova_compute[183177]: 2026-01-26 19:37:26.345 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:37:26 compute-0 nova_compute[183177]: 2026-01-26 19:37:26.430 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:26 compute-0 nova_compute[183177]: 2026-01-26 19:37:26.910 183181 WARNING neutronclient.v2_0.client [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.153 183181 DEBUG nova.network.neutron [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Updating instance_info_cache with network_info: [{"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.661 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Releasing lock "refresh_cache-3316d5bf-2fc3-439d-be93-54696ee605b1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.662 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance network_info: |[{"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.664 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Start _get_guest_xml network_info=[{"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.669 183181 WARNING nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.670 183181 DEBUG nova.virt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-563859767', uuid='3316d5bf-2fc3-439d-be93-54696ee605b1'), owner=OwnerMeta(userid='d5f58385817047fdb78488b13ec067ee', username='tempest-TestDataModel-933456558-project-admin', projectid='f835ebdba84c44c2a95961eb13570992', projectname='tempest-TestDataModel-933456558'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456247.6706848) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.674 183181 DEBUG nova.virt.libvirt.host [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.675 183181 DEBUG nova.virt.libvirt.host [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.678 183181 DEBUG nova.virt.libvirt.host [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.679 183181 DEBUG nova.virt.libvirt.host [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.680 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.681 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.681 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.681 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.681 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.682 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.682 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.682 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.682 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.683 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.683 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.683 183181 DEBUG nova.virt.hardware [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.689 183181 DEBUG nova.virt.libvirt.vif [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-563859767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-563859767',id=2,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f835ebdba84c44c2a95961eb13570992',ramdisk_id='',reservation_id='r-qlqbfw8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-933456558',owner_user_name='tempest-TestDataModel-933456558-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:37:22Z,user_data=None,user_id='d5f58385817047fdb78488b13ec067ee',uuid=3316d5bf-2fc3-439d-be93-54696ee605b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.690 183181 DEBUG nova.network.os_vif_util [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converting VIF {"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.692 183181 DEBUG nova.network.os_vif_util [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:37:27 compute-0 nova_compute[183177]: 2026-01-26 19:37:27.693 183181 DEBUG nova.objects.instance [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3316d5bf-2fc3-439d-be93-54696ee605b1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.206 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <uuid>3316d5bf-2fc3-439d-be93-54696ee605b1</uuid>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <name>instance-00000002</name>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:name>tempest-TestDataModel-server-563859767</nova:name>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:37:27</nova:creationTime>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:37:28 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:37:28 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:user uuid="d5f58385817047fdb78488b13ec067ee">tempest-TestDataModel-933456558-project-admin</nova:user>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:project uuid="f835ebdba84c44c2a95961eb13570992">tempest-TestDataModel-933456558</nova:project>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         <nova:port uuid="ab0e96d7-c389-4db4-8ce6-f189706fb705">
Jan 26 19:37:28 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <system>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="serial">3316d5bf-2fc3-439d-be93-54696ee605b1</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="uuid">3316d5bf-2fc3-439d-be93-54696ee605b1</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </system>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <os>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </os>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <features>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </features>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.config"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:30:23:34"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <target dev="tapab0e96d7-c3"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/console.log" append="off"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <video>
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </video>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:37:28 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:37:28 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:37:28 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:37:28 compute-0 nova_compute[183177]: </domain>
Jan 26 19:37:28 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.208 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Preparing to wait for external event network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.209 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.209 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.209 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.210 183181 DEBUG nova.virt.libvirt.vif [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-563859767',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-563859767',id=2,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f835ebdba84c44c2a95961eb13570992',ramdisk_id='',reservation_id='r-qlqbfw8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-933456558',owner_user_name='tempest-TestDataModel-933456558-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:37:22Z,user_data=None,user_id='d5f58385817047fdb78488b13ec067ee',uuid=3316d5bf-2fc3-439d-be93-54696ee605b1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.211 183181 DEBUG nova.network.os_vif_util [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converting VIF {"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.211 183181 DEBUG nova.network.os_vif_util [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.212 183181 DEBUG os_vif [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.212 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.213 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.213 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.214 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.214 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '239c41ac-8324-54f0-bf64-9dde7513631e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.251 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.252 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.256 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.257 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0e96d7-c3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.258 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapab0e96d7-c3, col_values=(('qos', UUID('b06ac49e-a38d-47fb-82a5-15788bc89caf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.258 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapab0e96d7-c3, col_values=(('external_ids', {'iface-id': 'ab0e96d7-c389-4db4-8ce6-f189706fb705', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:23:34', 'vm-uuid': '3316d5bf-2fc3-439d-be93-54696ee605b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.260 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 NetworkManager[55489]: <info>  [1769456248.2635] manager: (tapab0e96d7-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.263 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.269 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.269 183181 INFO os_vif [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3')
Jan 26 19:37:28 compute-0 nova_compute[183177]: 2026-01-26 19:37:28.646 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:29 compute-0 podman[192499]: time="2026-01-26T19:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:37:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:37:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 26 19:37:29 compute-0 nova_compute[183177]: 2026-01-26 19:37:29.827 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:37:29 compute-0 nova_compute[183177]: 2026-01-26 19:37:29.827 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:37:29 compute-0 nova_compute[183177]: 2026-01-26 19:37:29.828 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] No VIF found with MAC fa:16:3e:30:23:34, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:37:29 compute-0 nova_compute[183177]: 2026-01-26 19:37:29.828 183181 INFO nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Using config drive
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.340 183181 WARNING neutronclient.v2_0.client [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.596 183181 INFO nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Creating config drive at /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.config
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.610 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpsaf3dkaq execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.754 183181 DEBUG oslo_concurrency.processutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpsaf3dkaq" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:30 compute-0 kernel: tapab0e96d7-c3: entered promiscuous mode
Jan 26 19:37:30 compute-0 NetworkManager[55489]: <info>  [1769456250.8458] manager: (tapab0e96d7-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.898 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:30 compute-0 ovn_controller[95396]: 2026-01-26T19:37:30Z|00049|binding|INFO|Claiming lport ab0e96d7-c389-4db4-8ce6-f189706fb705 for this chassis.
Jan 26 19:37:30 compute-0 ovn_controller[95396]: 2026-01-26T19:37:30Z|00050|binding|INFO|ab0e96d7-c389-4db4-8ce6-f189706fb705: Claiming fa:16:3e:30:23:34 10.100.0.13
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.905 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.921 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:23:34 10.100.0.13'], port_security=['fa:16:3e:30:23:34 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3316d5bf-2fc3-439d-be93-54696ee605b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f835ebdba84c44c2a95961eb13570992', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64bb9ee9-4ffc-4876-8906-ddc25248737e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d064364a-4825-4a41-ba3d-524a7bff4047, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=ab0e96d7-c389-4db4-8ce6-f189706fb705) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.924 104672 INFO neutron.agent.ovn.metadata.agent [-] Port ab0e96d7-c389-4db4-8ce6-f189706fb705 in datapath 414e7f05-0834-4508-b4cf-6d4a0b570b9e bound to our chassis
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.926 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 414e7f05-0834-4508-b4cf-6d4a0b570b9e
Jan 26 19:37:30 compute-0 systemd-machined[154465]: New machine qemu-2-instance-00000002.
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.940 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[54595b36-5660-4caf-9d61-e1aaefe021a6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.943 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap414e7f05-01 in ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:37:30 compute-0 systemd-udevd[205247]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.949 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap414e7f05-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.949 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b25dea24-49fd-4ac1-91d0-cd5a8e347759]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.950 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9b46c61a-f3ee-45d1-b48b-47b07b5029d8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:30 compute-0 NetworkManager[55489]: <info>  [1769456250.9622] device (tapab0e96d7-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:37:30 compute-0 ovn_controller[95396]: 2026-01-26T19:37:30Z|00051|binding|INFO|Setting lport ab0e96d7-c389-4db4-8ce6-f189706fb705 ovn-installed in OVS
Jan 26 19:37:30 compute-0 ovn_controller[95396]: 2026-01-26T19:37:30Z|00052|binding|INFO|Setting lport ab0e96d7-c389-4db4-8ce6-f189706fb705 up in Southbound
Jan 26 19:37:30 compute-0 NetworkManager[55489]: <info>  [1769456250.9651] device (tapab0e96d7-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:37:30 compute-0 nova_compute[183177]: 2026-01-26 19:37:30.964 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:30 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Jan 26 19:37:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.975 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c6dbe4-dd32-4d5f-8fd8-f636a914cde0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:30.999 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b36330-f4c9-40e9-97f9-80d2f7abb54c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.041 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[34783593-93ca-47a3-b7c1-e067e50de4c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.046 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa3af2f-8f80-4eb3-9914-a1d7b773dab0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 NetworkManager[55489]: <info>  [1769456251.0477] manager: (tap414e7f05-00): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.096 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d568738c-3f01-4212-9def-23750c4f5fd5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.101 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[89badd64-fbf0-47b1-98a2-56405a269e65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 NetworkManager[55489]: <info>  [1769456251.1304] device (tap414e7f05-00): carrier: link connected
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.139 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c55453ae-3811-404f-8b31-b335eb38b906]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.159 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9d73c3ff-ca96-4ede-96ea-b4c9c881be09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap414e7f05-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a8:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371001, 'reachable_time': 16777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205279, 'error': None, 'target': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.179 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfd8161-3893-4bff-95b9-291e5fd32780]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:a8a0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371001, 'tstamp': 371001}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205280, 'error': None, 'target': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.198 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1cdc32d0-4c8a-4a5c-9f1d-9ab83dcea93f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap414e7f05-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:a8:a0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371001, 'reachable_time': 16777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 205281, 'error': None, 'target': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.236 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e9b71ac9-de52-4ff5-b9bd-05a8ae477e23]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.325 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[13611415-cbcb-4df0-a0dc-e84bad7f2900]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.327 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap414e7f05-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.327 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.327 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap414e7f05-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.329 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:31 compute-0 kernel: tap414e7f05-00: entered promiscuous mode
Jan 26 19:37:31 compute-0 NetworkManager[55489]: <info>  [1769456251.3311] manager: (tap414e7f05-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.333 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.334 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap414e7f05-00, col_values=(('external_ids', {'iface-id': '56efd16c-bb3e-4b64-8c74-e0372eb8bd16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.334 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:31 compute-0 ovn_controller[95396]: 2026-01-26T19:37:31Z|00053|binding|INFO|Releasing lport 56efd16c-bb3e-4b64-8c74-e0372eb8bd16 from this chassis (sb_readonly=0)
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.335 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.337 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6df92e-88e7-4be2-8c3a-14f57b54ebb6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.338 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.338 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.339 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 414e7f05-0834-4508-b4cf-6d4a0b570b9e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.339 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.339 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[98a09bf5-e434-4329-a93c-69f1969c8f47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.340 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.340 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e615f18-38b9-4e79-96bd-1743a23dcb57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.341 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-414e7f05-0834-4508-b4cf-6d4a0b570b9e
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 414e7f05-0834-4508-b4cf-6d4a0b570b9e
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:37:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:31.343 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'env', 'PROCESS_TAG=haproxy-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/414e7f05-0834-4508-b4cf-6d4a0b570b9e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.346 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:31 compute-0 openstack_network_exporter[195363]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:37:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:37:31 compute-0 openstack_network_exporter[195363]: ERROR   19:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:37:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.813 183181 DEBUG nova.compute.manager [req-d56d8dfb-a3e8-48fa-89a7-197ce4539228 req-827be4fb-6d5e-411b-984b-0be38d89f839 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.813 183181 DEBUG oslo_concurrency.lockutils [req-d56d8dfb-a3e8-48fa-89a7-197ce4539228 req-827be4fb-6d5e-411b-984b-0be38d89f839 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.814 183181 DEBUG oslo_concurrency.lockutils [req-d56d8dfb-a3e8-48fa-89a7-197ce4539228 req-827be4fb-6d5e-411b-984b-0be38d89f839 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.814 183181 DEBUG oslo_concurrency.lockutils [req-d56d8dfb-a3e8-48fa-89a7-197ce4539228 req-827be4fb-6d5e-411b-984b-0be38d89f839 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.815 183181 DEBUG nova.compute.manager [req-d56d8dfb-a3e8-48fa-89a7-197ce4539228 req-827be4fb-6d5e-411b-984b-0be38d89f839 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Processing event network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.816 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.822 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.828 183181 INFO nova.virt.libvirt.driver [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance spawned successfully.
Jan 26 19:37:31 compute-0 nova_compute[183177]: 2026-01-26 19:37:31.828 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:37:31 compute-0 podman[205320]: 2026-01-26 19:37:31.871974937 +0000 UTC m=+0.095638043 container create 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 19:37:31 compute-0 podman[205320]: 2026-01-26 19:37:31.825928633 +0000 UTC m=+0.049591839 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:37:31 compute-0 systemd[1]: Started libpod-conmon-4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c.scope.
Jan 26 19:37:31 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:37:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/deb573d4cc20105a5bf2b60c05d9e5a33e338928064dcdd4f7fe1c5ed445121f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:37:31 compute-0 podman[205320]: 2026-01-26 19:37:31.994799186 +0000 UTC m=+0.218462312 container init 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:37:32 compute-0 podman[205320]: 2026-01-26 19:37:32.006170805 +0000 UTC m=+0.229833911 container start 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 19:37:32 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [NOTICE]   (205339) : New worker (205341) forked
Jan 26 19:37:32 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [NOTICE]   (205339) : Loading success.
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.343 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.343 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.344 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.344 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.344 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:32 compute-0 nova_compute[183177]: 2026-01-26 19:37:32.345 183181 DEBUG nova.virt.libvirt.driver [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.304 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.371 183181 INFO nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Took 9.58 seconds to spawn the instance on the hypervisor.
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.372 183181 DEBUG nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.650 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.874 183181 DEBUG nova.compute.manager [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.875 183181 DEBUG oslo_concurrency.lockutils [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.875 183181 DEBUG oslo_concurrency.lockutils [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.875 183181 DEBUG oslo_concurrency.lockutils [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.875 183181 DEBUG nova.compute.manager [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] No waiting events found dispatching network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.875 183181 WARNING nova.compute.manager [req-61f95dfb-deef-4aad-8038-0f0e9c32652a req-9d748dae-6d97-45d1-8676-ed1f070cd0fe 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received unexpected event network-vif-plugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 for instance with vm_state active and task_state None.
Jan 26 19:37:33 compute-0 nova_compute[183177]: 2026-01-26 19:37:33.903 183181 INFO nova.compute.manager [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Took 14.83 seconds to build instance.
Jan 26 19:37:34 compute-0 nova_compute[183177]: 2026-01-26 19:37:34.407 183181 DEBUG oslo_concurrency.lockutils [None req-3c4dcdf8-ed38-4d0a-b851-5c93a17e07e6 d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.348s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:37 compute-0 nova_compute[183177]: 2026-01-26 19:37:37.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:37 compute-0 nova_compute[183177]: 2026-01-26 19:37:37.155 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:37:37 compute-0 nova_compute[183177]: 2026-01-26 19:37:37.662 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:37:38 compute-0 nova_compute[183177]: 2026-01-26 19:37:38.349 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:38 compute-0 nova_compute[183177]: 2026-01-26 19:37:38.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:40 compute-0 nova_compute[183177]: 2026-01-26 19:37:40.662 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:41 compute-0 nova_compute[183177]: 2026-01-26 19:37:41.174 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:41 compute-0 nova_compute[183177]: 2026-01-26 19:37:41.175 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:41 compute-0 nova_compute[183177]: 2026-01-26 19:37:41.175 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:41 compute-0 nova_compute[183177]: 2026-01-26 19:37:41.175 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.231 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.296 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.297 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.385 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.583 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.586 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.615 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.616 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.10248565673828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.617 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:37:42 compute-0 nova_compute[183177]: 2026-01-26 19:37:42.618 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.399 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 3316d5bf-2fc3-439d-be93-54696ee605b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:37:42 up  1:02,  0 user,  load average: 0.44, 0.38, 0.52\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_f835ebdba84c44c2a95961eb13570992': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:37:43 compute-0 nova_compute[183177]: 2026-01-26 19:37:43.716 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:37:44 compute-0 nova_compute[183177]: 2026-01-26 19:37:44.226 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:37:44 compute-0 podman[205366]: 2026-01-26 19:37:44.387289084 +0000 UTC m=+0.128242632 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 19:37:44 compute-0 nova_compute[183177]: 2026-01-26 19:37:44.737 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:37:44 compute-0 nova_compute[183177]: 2026-01-26 19:37:44.739 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:37:44 compute-0 nova_compute[183177]: 2026-01-26 19:37:44.739 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:44 compute-0 ovn_controller[95396]: 2026-01-26T19:37:44Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:30:23:34 10.100.0.13
Jan 26 19:37:44 compute-0 ovn_controller[95396]: 2026-01-26T19:37:44Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:30:23:34 10.100.0.13
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.690 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:46.692 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:37:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:46.694 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.735 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.736 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.736 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.736 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.737 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.737 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:46 compute-0 nova_compute[183177]: 2026-01-26 19:37:46.737 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:37:47 compute-0 nova_compute[183177]: 2026-01-26 19:37:47.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:47 compute-0 podman[205393]: 2026-01-26 19:37:47.359821342 +0000 UTC m=+0.097155242 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 26 19:37:47 compute-0 podman[205394]: 2026-01-26 19:37:47.364853175 +0000 UTC m=+0.094240086 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 19:37:47 compute-0 nova_compute[183177]: 2026-01-26 19:37:47.659 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:47 compute-0 nova_compute[183177]: 2026-01-26 19:37:47.659 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:37:47 compute-0 nova_compute[183177]: 2026-01-26 19:37:47.660 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:37:48 compute-0 nova_compute[183177]: 2026-01-26 19:37:48.404 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:48 compute-0 nova_compute[183177]: 2026-01-26 19:37:48.657 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:37:50.695 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:37:53 compute-0 nova_compute[183177]: 2026-01-26 19:37:53.450 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:53 compute-0 nova_compute[183177]: 2026-01-26 19:37:53.661 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:56 compute-0 podman[205433]: 2026-01-26 19:37:56.371914581 +0000 UTC m=+0.116153704 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:37:58 compute-0 nova_compute[183177]: 2026-01-26 19:37:58.501 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:58 compute-0 nova_compute[183177]: 2026-01-26 19:37:58.666 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:37:59 compute-0 sshd-session[205459]: banner exchange: Connection from 45.227.254.156 port 65366: invalid format
Jan 26 19:37:59 compute-0 podman[192499]: time="2026-01-26T19:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:37:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:37:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Jan 26 19:38:01 compute-0 openstack_network_exporter[195363]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:38:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:38:01 compute-0 openstack_network_exporter[195363]: ERROR   19:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:38:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:38:03 compute-0 nova_compute[183177]: 2026-01-26 19:38:03.546 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:03 compute-0 nova_compute[183177]: 2026-01-26 19:38:03.668 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:08 compute-0 nova_compute[183177]: 2026-01-26 19:38:08.597 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:08 compute-0 nova_compute[183177]: 2026-01-26 19:38:08.671 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.842 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.844 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.844 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.845 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.845 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:12 compute-0 nova_compute[183177]: 2026-01-26 19:38:12.879 183181 INFO nova.compute.manager [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Terminating instance
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.396 183181 DEBUG nova.compute.manager [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:38:13 compute-0 kernel: tapab0e96d7-c3 (unregistering): left promiscuous mode
Jan 26 19:38:13 compute-0 NetworkManager[55489]: <info>  [1769456293.4267] device (tapab0e96d7-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.468 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 ovn_controller[95396]: 2026-01-26T19:38:13Z|00054|binding|INFO|Releasing lport ab0e96d7-c389-4db4-8ce6-f189706fb705 from this chassis (sb_readonly=0)
Jan 26 19:38:13 compute-0 ovn_controller[95396]: 2026-01-26T19:38:13Z|00055|binding|INFO|Setting lport ab0e96d7-c389-4db4-8ce6-f189706fb705 down in Southbound
Jan 26 19:38:13 compute-0 ovn_controller[95396]: 2026-01-26T19:38:13Z|00056|binding|INFO|Removing iface tapab0e96d7-c3 ovn-installed in OVS
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.472 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.482 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:23:34 10.100.0.13'], port_security=['fa:16:3e:30:23:34 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3316d5bf-2fc3-439d-be93-54696ee605b1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f835ebdba84c44c2a95961eb13570992', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64bb9ee9-4ffc-4876-8906-ddc25248737e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d064364a-4825-4a41-ba3d-524a7bff4047, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=ab0e96d7-c389-4db4-8ce6-f189706fb705) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.484 104672 INFO neutron.agent.ovn.metadata.agent [-] Port ab0e96d7-c389-4db4-8ce6-f189706fb705 in datapath 414e7f05-0834-4508-b4cf-6d4a0b570b9e unbound from our chassis
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.485 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 414e7f05-0834-4508-b4cf-6d4a0b570b9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.487 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[27adc5b6-0233-4765-9911-bac5196fb70e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.488 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e namespace which is not needed anymore
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.496 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 26 19:38:13 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.946s CPU time.
Jan 26 19:38:13 compute-0 systemd-machined[154465]: Machine qemu-2-instance-00000002 terminated.
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.600 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.624 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.631 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [NOTICE]   (205339) : haproxy version is 3.0.5-8e879a5
Jan 26 19:38:13 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [NOTICE]   (205339) : path to executable is /usr/sbin/haproxy
Jan 26 19:38:13 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [WARNING]  (205339) : Exiting Master process...
Jan 26 19:38:13 compute-0 podman[205486]: 2026-01-26 19:38:13.664732693 +0000 UTC m=+0.044773602 container kill 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:38:13 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [ALERT]    (205339) : Current worker (205341) exited with code 143 (Terminated)
Jan 26 19:38:13 compute-0 neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e[205335]: [WARNING]  (205339) : All workers exited. Exiting... (0)
Jan 26 19:38:13 compute-0 systemd[1]: libpod-4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c.scope: Deactivated successfully.
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.672 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.679 183181 INFO nova.virt.libvirt.driver [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Instance destroyed successfully.
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.680 183181 DEBUG nova.objects.instance [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lazy-loading 'resources' on Instance uuid 3316d5bf-2fc3-439d-be93-54696ee605b1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.711 183181 DEBUG nova.compute.manager [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.712 183181 DEBUG oslo_concurrency.lockutils [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.712 183181 DEBUG oslo_concurrency.lockutils [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.713 183181 DEBUG oslo_concurrency.lockutils [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.713 183181 DEBUG nova.compute.manager [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] No waiting events found dispatching network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.714 183181 DEBUG nova.compute.manager [req-bac7fca7-774b-492a-8bdd-1175cb612b15 req-dbbdeca8-ec64-4da2-bb1a-31edb0bc67be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:38:13 compute-0 podman[205518]: 2026-01-26 19:38:13.725983057 +0000 UTC m=+0.035132357 container died 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c-userdata-shm.mount: Deactivated successfully.
Jan 26 19:38:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-deb573d4cc20105a5bf2b60c05d9e5a33e338928064dcdd4f7fe1c5ed445121f-merged.mount: Deactivated successfully.
Jan 26 19:38:13 compute-0 podman[205518]: 2026-01-26 19:38:13.759348487 +0000 UTC m=+0.068497757 container cleanup 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 19:38:13 compute-0 systemd[1]: libpod-conmon-4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c.scope: Deactivated successfully.
Jan 26 19:38:13 compute-0 podman[205520]: 2026-01-26 19:38:13.776247592 +0000 UTC m=+0.064435640 container remove 4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.785 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[275fa3ef-802b-4585-b499-2c0ced7ffe6b]: (4, ("Mon Jan 26 07:38:13 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e (4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c)\n4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c\nMon Jan 26 07:38:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e (4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c)\n4bfe12d35cefdcc962e95d8151048c4ddaf7a3dae8f778139575c277a0fa903c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.787 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[49e1d4fb-c7fe-4195-ae4e-855f21f7c59c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.787 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/414e7f05-0834-4508-b4cf-6d4a0b570b9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.788 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[534e0aee-2aa5-426e-983f-9acae62e12ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.788 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap414e7f05-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.791 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 kernel: tap414e7f05-00: left promiscuous mode
Jan 26 19:38:13 compute-0 nova_compute[183177]: 2026-01-26 19:38:13.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.824 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bdf56e-0ab4-47a6-a391-9f3d05e6d8ab]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.839 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[76360516-501b-4091-8774-d2b228099734]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.841 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fa997991-0054-478b-bd97-7fa3abc404a6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.865 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1e9493-e14c-475f-9c05-48a9256a1f44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 370992, 'reachable_time': 41737, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205552, 'error': None, 'target': 'ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.869 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-414e7f05-0834-4508-b4cf-6d4a0b570b9e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:38:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:13.869 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[44daa962-f31d-4f59-93cb-8596af8b4dc8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d414e7f05\x2d0834\x2d4508\x2db4cf\x2d6d4a0b570b9e.mount: Deactivated successfully.
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.188 183181 DEBUG nova.virt.libvirt.vif [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:37:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-563859767',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-563859767',id=2,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:37:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f835ebdba84c44c2a95961eb13570992',ramdisk_id='',reservation_id='r-qlqbfw8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pro
ject_name='tempest-TestDataModel-933456558',owner_user_name='tempest-TestDataModel-933456558-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:37:33Z,user_data=None,user_id='d5f58385817047fdb78488b13ec067ee',uuid=3316d5bf-2fc3-439d-be93-54696ee605b1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.189 183181 DEBUG nova.network.os_vif_util [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converting VIF {"id": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "address": "fa:16:3e:30:23:34", "network": {"id": "414e7f05-0834-4508-b4cf-6d4a0b570b9e", "bridge": "br-int", "label": "tempest-TestDataModel-700287017-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "25a4730ad48c4354a0553a82065b2a70", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0e96d7-c3", "ovs_interfaceid": "ab0e96d7-c389-4db4-8ce6-f189706fb705", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.189 183181 DEBUG nova.network.os_vif_util [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.190 183181 DEBUG os_vif [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.192 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.192 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0e96d7-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.193 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.195 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.196 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.197 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b06ac49e-a38d-47fb-82a5-15788bc89caf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.198 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.199 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.201 183181 INFO os_vif [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:23:34,bridge_name='br-int',has_traffic_filtering=True,id=ab0e96d7-c389-4db4-8ce6-f189706fb705,network=Network(414e7f05-0834-4508-b4cf-6d4a0b570b9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0e96d7-c3')
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.202 183181 INFO nova.virt.libvirt.driver [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Deleting instance files /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1_del
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.203 183181 INFO nova.virt.libvirt.driver [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Deletion of /var/lib/nova/instances/3316d5bf-2fc3-439d-be93-54696ee605b1_del complete
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.718 183181 INFO nova.compute.manager [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.718 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.719 183181 DEBUG nova.compute.manager [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.719 183181 DEBUG nova.network.neutron [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:38:14 compute-0 nova_compute[183177]: 2026-01-26 19:38:14.720 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.344 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:38:15 compute-0 podman[205553]: 2026-01-26 19:38:15.372172333 +0000 UTC m=+0.121225137 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.792 183181 DEBUG nova.compute.manager [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.792 183181 DEBUG oslo_concurrency.lockutils [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.793 183181 DEBUG oslo_concurrency.lockutils [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.793 183181 DEBUG oslo_concurrency.lockutils [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.793 183181 DEBUG nova.compute.manager [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] No waiting events found dispatching network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:38:15 compute-0 nova_compute[183177]: 2026-01-26 19:38:15.793 183181 DEBUG nova.compute.manager [req-c8666342-71eb-4b04-94e7-09206c4f0241 req-6368fb4e-a9af-4925-9553-749f857229e7 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-unplugged-ab0e96d7-c389-4db4-8ce6-f189706fb705 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:38:16 compute-0 nova_compute[183177]: 2026-01-26 19:38:16.596 183181 DEBUG nova.compute.manager [req-59924512-b381-47c8-8e03-dc62f21a8481 req-8b853694-7809-4fd1-8756-8c9bf5810806 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Received event network-vif-deleted-ab0e96d7-c389-4db4-8ce6-f189706fb705 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:38:16 compute-0 nova_compute[183177]: 2026-01-26 19:38:16.596 183181 INFO nova.compute.manager [req-59924512-b381-47c8-8e03-dc62f21a8481 req-8b853694-7809-4fd1-8756-8c9bf5810806 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Neutron deleted interface ab0e96d7-c389-4db4-8ce6-f189706fb705; detaching it from the instance and deleting it from the info cache
Jan 26 19:38:16 compute-0 nova_compute[183177]: 2026-01-26 19:38:16.597 183181 DEBUG nova.network.neutron [req-59924512-b381-47c8-8e03-dc62f21a8481 req-8b853694-7809-4fd1-8756-8c9bf5810806 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:38:16 compute-0 nova_compute[183177]: 2026-01-26 19:38:16.921 183181 DEBUG nova.network.neutron [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:38:17 compute-0 nova_compute[183177]: 2026-01-26 19:38:17.104 183181 DEBUG nova.compute.manager [req-59924512-b381-47c8-8e03-dc62f21a8481 req-8b853694-7809-4fd1-8756-8c9bf5810806 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Detach interface failed, port_id=ab0e96d7-c389-4db4-8ce6-f189706fb705, reason: Instance 3316d5bf-2fc3-439d-be93-54696ee605b1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 19:38:17 compute-0 nova_compute[183177]: 2026-01-26 19:38:17.433 183181 INFO nova.compute.manager [-] [instance: 3316d5bf-2fc3-439d-be93-54696ee605b1] Took 2.71 seconds to deallocate network for instance.
Jan 26 19:38:17 compute-0 nova_compute[183177]: 2026-01-26 19:38:17.958 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:17 compute-0 nova_compute[183177]: 2026-01-26 19:38:17.959 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:18 compute-0 nova_compute[183177]: 2026-01-26 19:38:18.017 183181 DEBUG nova.compute.provider_tree [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:38:18 compute-0 podman[205584]: 2026-01-26 19:38:18.346290893 +0000 UTC m=+0.090780374 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Jan 26 19:38:18 compute-0 podman[205583]: 2026-01-26 19:38:18.347674519 +0000 UTC m=+0.090590239 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Jan 26 19:38:18 compute-0 nova_compute[183177]: 2026-01-26 19:38:18.525 183181 DEBUG nova.scheduler.client.report [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:38:18 compute-0 nova_compute[183177]: 2026-01-26 19:38:18.723 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:19 compute-0 nova_compute[183177]: 2026-01-26 19:38:19.040 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:19 compute-0 nova_compute[183177]: 2026-01-26 19:38:19.072 183181 INFO nova.scheduler.client.report [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Deleted allocations for instance 3316d5bf-2fc3-439d-be93-54696ee605b1
Jan 26 19:38:19 compute-0 nova_compute[183177]: 2026-01-26 19:38:19.199 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:20 compute-0 nova_compute[183177]: 2026-01-26 19:38:20.104 183181 DEBUG oslo_concurrency.lockutils [None req-ed765a59-86fc-4982-9939-61f0cb04045c d5f58385817047fdb78488b13ec067ee f835ebdba84c44c2a95961eb13570992 - - default default] Lock "3316d5bf-2fc3-439d-be93-54696ee605b1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.261s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:23 compute-0 nova_compute[183177]: 2026-01-26 19:38:23.726 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:24.031 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:24.032 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:24.032 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:24 compute-0 nova_compute[183177]: 2026-01-26 19:38:24.201 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:27 compute-0 podman[205621]: 2026-01-26 19:38:27.30308291 +0000 UTC m=+0.059696975 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:38:27 compute-0 nova_compute[183177]: 2026-01-26 19:38:27.992 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:28 compute-0 nova_compute[183177]: 2026-01-26 19:38:28.729 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:29 compute-0 nova_compute[183177]: 2026-01-26 19:38:29.203 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:29 compute-0 podman[192499]: time="2026-01-26T19:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:38:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:38:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 26 19:38:31 compute-0 openstack_network_exporter[195363]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:38:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:38:31 compute-0 openstack_network_exporter[195363]: ERROR   19:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:38:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:38:33 compute-0 nova_compute[183177]: 2026-01-26 19:38:33.733 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:34 compute-0 nova_compute[183177]: 2026-01-26 19:38:34.205 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:38 compute-0 nova_compute[183177]: 2026-01-26 19:38:38.735 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:39 compute-0 nova_compute[183177]: 2026-01-26 19:38:39.207 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:41 compute-0 nova_compute[183177]: 2026-01-26 19:38:41.246 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.666 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.873 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.875 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.892 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.893 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.10329818725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.894 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:38:42 compute-0 nova_compute[183177]: 2026-01-26 19:38:42.894 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:38:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:43.333 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:b9:49 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '016b50a9944a48cc96f3b5dca58e6a4b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8041d9b1-f920-4d7e-a0aa-621f944e098e) old=Port_Binding(mac=['fa:16:3e:7c:b9:49'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '016b50a9944a48cc96f3b5dca58e6a4b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:38:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:43.334 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8041d9b1-f920-4d7e-a0aa-621f944e098e in datapath 02893814-74cb-419e-9539-9a1c8c79b4be updated
Jan 26 19:38:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:43.335 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02893814-74cb-419e-9539-9a1c8c79b4be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:38:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:43.336 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[64626c93-85ce-4189-b4e5-8c68e0239b44]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:43 compute-0 nova_compute[183177]: 2026-01-26 19:38:43.736 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:43 compute-0 nova_compute[183177]: 2026-01-26 19:38:43.986 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:38:43 compute-0 nova_compute[183177]: 2026-01-26 19:38:43.986 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:38:42 up  1:03,  0 user,  load average: 0.20, 0.33, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:38:44 compute-0 nova_compute[183177]: 2026-01-26 19:38:44.030 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:38:44 compute-0 nova_compute[183177]: 2026-01-26 19:38:44.209 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:44 compute-0 nova_compute[183177]: 2026-01-26 19:38:44.540 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:38:45 compute-0 nova_compute[183177]: 2026-01-26 19:38:45.050 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:38:45 compute-0 nova_compute[183177]: 2026-01-26 19:38:45.050 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:38:46 compute-0 podman[205647]: 2026-01-26 19:38:46.360140868 +0000 UTC m=+0.106047538 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.047 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.048 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.048 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.048 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.049 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.049 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:47 compute-0 nova_compute[183177]: 2026-01-26 19:38:47.050 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:38:48 compute-0 nova_compute[183177]: 2026-01-26 19:38:48.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:38:48 compute-0 nova_compute[183177]: 2026-01-26 19:38:48.737 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:49 compute-0 nova_compute[183177]: 2026-01-26 19:38:49.211 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:49 compute-0 podman[205674]: 2026-01-26 19:38:49.329637416 +0000 UTC m=+0.068868817 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 19:38:49 compute-0 podman[205673]: 2026-01-26 19:38:49.3377595 +0000 UTC m=+0.083944284 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Jan 26 19:38:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:52.258 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:38:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:52.258 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:38:52 compute-0 nova_compute[183177]: 2026-01-26 19:38:52.259 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:53.387 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:17:e5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-80324d21-9187-45dd-a72b-56436ad69756', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80324d21-9187-45dd-a72b-56436ad69756', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=548c53ed-3e57-4a2f-8db0-c2d59964982f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0eaf9a24-b8b2-4279-aac7-05bc9527dc1c) old=Port_Binding(mac=['fa:16:3e:6b:17:e5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-80324d21-9187-45dd-a72b-56436ad69756', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80324d21-9187-45dd-a72b-56436ad69756', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:38:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:53.388 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0eaf9a24-b8b2-4279-aac7-05bc9527dc1c in datapath 80324d21-9187-45dd-a72b-56436ad69756 updated
Jan 26 19:38:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:53.389 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80324d21-9187-45dd-a72b-56436ad69756, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:38:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:38:53.390 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b98889f5-3dfc-4b60-87c8-cad43939b661]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:38:53 compute-0 nova_compute[183177]: 2026-01-26 19:38:53.741 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:54 compute-0 nova_compute[183177]: 2026-01-26 19:38:54.214 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:56 compute-0 sshd-session[205712]: Invalid user oracle from 193.32.162.151 port 43956
Jan 26 19:38:58 compute-0 podman[205714]: 2026-01-26 19:38:58.312762648 +0000 UTC m=+0.065529629 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:38:58 compute-0 nova_compute[183177]: 2026-01-26 19:38:58.743 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:59 compute-0 nova_compute[183177]: 2026-01-26 19:38:59.217 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:38:59 compute-0 podman[192499]: time="2026-01-26T19:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:38:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:38:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 26 19:38:59 compute-0 sshd-session[205712]: Connection closed by invalid user oracle 193.32.162.151 port 43956 [preauth]
Jan 26 19:39:01 compute-0 openstack_network_exporter[195363]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:39:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:39:01 compute-0 openstack_network_exporter[195363]: ERROR   19:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:39:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:39:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:02.260 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:03 compute-0 ovn_controller[95396]: 2026-01-26T19:39:03Z|00057|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 19:39:03 compute-0 nova_compute[183177]: 2026-01-26 19:39:03.746 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:04 compute-0 nova_compute[183177]: 2026-01-26 19:39:04.218 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:08 compute-0 nova_compute[183177]: 2026-01-26 19:39:08.749 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:09 compute-0 nova_compute[183177]: 2026-01-26 19:39:09.220 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:13 compute-0 nova_compute[183177]: 2026-01-26 19:39:13.651 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:13 compute-0 nova_compute[183177]: 2026-01-26 19:39:13.651 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:13 compute-0 nova_compute[183177]: 2026-01-26 19:39:13.751 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.159 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.222 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.711 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.711 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.723 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:39:14 compute-0 nova_compute[183177]: 2026-01-26 19:39:14.724 183181 INFO nova.compute.claims [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:39:15 compute-0 nova_compute[183177]: 2026-01-26 19:39:15.815 183181 DEBUG nova.compute.provider_tree [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:39:16 compute-0 nova_compute[183177]: 2026-01-26 19:39:16.324 183181 DEBUG nova.scheduler.client.report [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:39:16 compute-0 nova_compute[183177]: 2026-01-26 19:39:16.834 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:16 compute-0 nova_compute[183177]: 2026-01-26 19:39:16.836 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:39:17 compute-0 nova_compute[183177]: 2026-01-26 19:39:17.349 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:39:17 compute-0 nova_compute[183177]: 2026-01-26 19:39:17.350 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:39:17 compute-0 nova_compute[183177]: 2026-01-26 19:39:17.351 183181 WARNING neutronclient.v2_0.client [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:39:17 compute-0 nova_compute[183177]: 2026-01-26 19:39:17.351 183181 WARNING neutronclient.v2_0.client [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:39:17 compute-0 podman[205740]: 2026-01-26 19:39:17.403305403 +0000 UTC m=+0.140656080 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 26 19:39:17 compute-0 nova_compute[183177]: 2026-01-26 19:39:17.860 183181 INFO nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:39:18 compute-0 nova_compute[183177]: 2026-01-26 19:39:18.085 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Successfully created port: 060e9277-c0a7-426c-af54-216da387f47d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:39:18 compute-0 nova_compute[183177]: 2026-01-26 19:39:18.372 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:39:18 compute-0 nova_compute[183177]: 2026-01-26 19:39:18.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.224 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.398 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.400 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.401 183181 INFO nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Creating image(s)
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.402 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.403 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.404 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.405 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.413 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.416 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.488 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.490 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.491 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.492 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.499 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.500 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.561 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:19 compute-0 nova_compute[183177]: 2026-01-26 19:39:19.563 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:20 compute-0 podman[205777]: 2026-01-26 19:39:20.361926388 +0000 UTC m=+0.090908607 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Jan 26 19:39:20 compute-0 podman[205776]: 2026-01-26 19:39:20.362648946 +0000 UTC m=+0.089870729 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.740 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk 1073741824" returned: 0 in 1.178s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.742 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.251s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.742 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.799 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.800 183181 DEBUG nova.virt.disk.api [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Checking if we can resize image /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.801 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.856 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.858 183181 DEBUG nova.virt.disk.api [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Cannot resize image /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.859 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.859 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Ensure instance console log exists: /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.860 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.860 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:20 compute-0 nova_compute[183177]: 2026-01-26 19:39:20.861 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.562 183181 DEBUG nova.compute.manager [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-changed-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.562 183181 DEBUG nova.compute.manager [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Refreshing instance network info cache due to event network-changed-060e9277-c0a7-426c-af54-216da387f47d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.563 183181 DEBUG oslo_concurrency.lockutils [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.563 183181 DEBUG oslo_concurrency.lockutils [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.563 183181 DEBUG nova.network.neutron [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Refreshing network info cache for port 060e9277-c0a7-426c-af54-216da387f47d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:39:21 compute-0 nova_compute[183177]: 2026-01-26 19:39:21.567 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Successfully updated port: 060e9277-c0a7-426c-af54-216da387f47d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.073 183181 WARNING neutronclient.v2_0.client [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.078 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.213 183181 DEBUG nova.network.neutron [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.372 183181 DEBUG nova.network.neutron [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.901 183181 DEBUG oslo_concurrency.lockutils [req-0c548533-51d5-4bd1-b47b-41c452cecc4f req-ed91a0b6-35dd-4789-b9c1-7d0b7b46dd1e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.902 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquired lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:39:22 compute-0 nova_compute[183177]: 2026-01-26 19:39:22.902 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:39:23 compute-0 nova_compute[183177]: 2026-01-26 19:39:23.657 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:39:23 compute-0 nova_compute[183177]: 2026-01-26 19:39:23.756 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.005 183181 WARNING neutronclient.v2_0.client [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:39:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:24.033 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:24.033 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:24.034 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.178 183181 DEBUG nova.network.neutron [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.225 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.715 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Releasing lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.715 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance network_info: |[{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.732 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start _get_guest_xml network_info=[{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.736 183181 WARNING nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.739 183181 DEBUG nova.virt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-174497542', uuid='04802a55-668d-42ba-bc20-72c2e3f29298'), owner=OwnerMeta(userid='ee1e0029a6ac4b56b09c165dc3cd4dda', username='tempest-TestExecuteActionsViaActuator-1232791976-project-admin', projectid='577ae27ca8cf44549308a35c420ae86d', projectname='tempest-TestExecuteActionsViaActuator-1232791976'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": 
"060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456364.738901) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.744 183181 DEBUG nova.virt.libvirt.host [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.745 183181 DEBUG nova.virt.libvirt.host [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.750 183181 DEBUG nova.virt.libvirt.host [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.751 183181 DEBUG nova.virt.libvirt.host [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.753 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.753 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.754 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.754 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.755 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.755 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.755 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.756 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.756 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.757 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.757 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.757 183181 DEBUG nova.virt.hardware [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.763 183181 DEBUG nova.virt.libvirt.vif [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsV
iaActuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:39:18Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.764 183181 DEBUG nova.network.os_vif_util [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.765 183181 DEBUG nova.network.os_vif_util [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:39:24 compute-0 nova_compute[183177]: 2026-01-26 19:39:24.766 183181 DEBUG nova.objects.instance [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lazy-loading 'pci_devices' on Instance uuid 04802a55-668d-42ba-bc20-72c2e3f29298 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.351 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <uuid>04802a55-668d-42ba-bc20-72c2e3f29298</uuid>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <name>instance-00000004</name>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-174497542</nova:name>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:39:24</nova:creationTime>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:39:25 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:39:25 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         <nova:port uuid="060e9277-c0a7-426c-af54-216da387f47d">
Jan 26 19:39:25 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <system>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="serial">04802a55-668d-42ba-bc20-72c2e3f29298</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="uuid">04802a55-668d-42ba-bc20-72c2e3f29298</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </system>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <os>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </os>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <features>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </features>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:09:71:c5"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <target dev="tap060e9277-c0"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/console.log" append="off"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <video>
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </video>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:39:25 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:39:25 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:39:25 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:39:25 compute-0 nova_compute[183177]: </domain>
Jan 26 19:39:25 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.353 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Preparing to wait for external event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.354 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.354 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.355 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.356 183181 DEBUG nova.virt.libvirt.vif [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:39:18Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.357 183181 DEBUG nova.network.os_vif_util [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.358 183181 DEBUG nova.network.os_vif_util [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.358 183181 DEBUG os_vif [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.360 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.360 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.361 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.362 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.363 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd875766e-996a-59f6-8c58-a663b46c2676', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.364 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.367 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.372 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.372 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap060e9277-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.372 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap060e9277-c0, col_values=(('qos', UUID('4a780890-c1c2-4d36-8b76-7b2996e38352')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.373 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap060e9277-c0, col_values=(('external_ids', {'iface-id': '060e9277-c0a7-426c-af54-216da387f47d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:71:c5', 'vm-uuid': '04802a55-668d-42ba-bc20-72c2e3f29298'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.375 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 NetworkManager[55489]: <info>  [1769456365.3766] manager: (tap060e9277-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.377 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.384 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:25 compute-0 nova_compute[183177]: 2026-01-26 19:39:25.386 183181 INFO os_vif [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0')
Jan 26 19:39:26 compute-0 nova_compute[183177]: 2026-01-26 19:39:26.941 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:39:26 compute-0 nova_compute[183177]: 2026-01-26 19:39:26.942 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:39:26 compute-0 nova_compute[183177]: 2026-01-26 19:39:26.942 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No VIF found with MAC fa:16:3e:09:71:c5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:39:26 compute-0 nova_compute[183177]: 2026-01-26 19:39:26.943 183181 INFO nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Using config drive
Jan 26 19:39:27 compute-0 nova_compute[183177]: 2026-01-26 19:39:27.455 183181 WARNING neutronclient.v2_0.client [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.440 183181 INFO nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Creating config drive at /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.447 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpz16nc63a execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.574 183181 DEBUG oslo_concurrency.processutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpz16nc63a" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:28 compute-0 kernel: tap060e9277-c0: entered promiscuous mode
Jan 26 19:39:28 compute-0 NetworkManager[55489]: <info>  [1769456368.6751] manager: (tap060e9277-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Jan 26 19:39:28 compute-0 ovn_controller[95396]: 2026-01-26T19:39:28Z|00058|binding|INFO|Claiming lport 060e9277-c0a7-426c-af54-216da387f47d for this chassis.
Jan 26 19:39:28 compute-0 ovn_controller[95396]: 2026-01-26T19:39:28Z|00059|binding|INFO|060e9277-c0a7-426c-af54-216da387f47d: Claiming fa:16:3e:09:71:c5 10.100.0.9
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.679 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.689 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.707 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:71:c5 10.100.0.9'], port_security=['fa:16:3e:09:71:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '04802a55-668d-42ba-bc20-72c2e3f29298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=060e9277-c0a7-426c-af54-216da387f47d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.708 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 060e9277-c0a7-426c-af54-216da387f47d in datapath 02893814-74cb-419e-9539-9a1c8c79b4be bound to our chassis
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.709 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:39:28 compute-0 systemd-udevd[205853]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.727 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[33994ce9-0779-4dbb-b5a1-6d43caebd32e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.728 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02893814-71 in ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.729 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02893814-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.729 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fa320015-ff32-4a8e-b34e-eca88dd821be]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.730 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6a547fa6-e540-4978-ba05-1cedef1e01cd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 NetworkManager[55489]: <info>  [1769456368.7532] device (tap060e9277-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.752 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[9bdcbec1-47af-4d20-92f1-be3901ffefb8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 NetworkManager[55489]: <info>  [1769456368.7548] device (tap060e9277-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:39:28 compute-0 systemd-machined[154465]: New machine qemu-3-instance-00000004.
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.774 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[662f2eff-f533-499c-81f6-b980c8aea336]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.777 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.780 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:28 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Jan 26 19:39:28 compute-0 ovn_controller[95396]: 2026-01-26T19:39:28Z|00060|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d ovn-installed in OVS
Jan 26 19:39:28 compute-0 ovn_controller[95396]: 2026-01-26T19:39:28Z|00061|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d up in Southbound
Jan 26 19:39:28 compute-0 nova_compute[183177]: 2026-01-26 19:39:28.787 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.825 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e3ea33-6443-406c-abe6-8c86557da9c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 podman[205835]: 2026-01-26 19:39:28.829710215 +0000 UTC m=+0.157387011 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:39:28 compute-0 systemd-udevd[205861]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:39:28 compute-0 NetworkManager[55489]: <info>  [1769456368.8335] manager: (tap02893814-70): new Veth device (/org/freedesktop/NetworkManager/Devices/30)
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.832 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4ce56a-07c5-47e6-b54e-4aa625d6fa67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.878 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c6508af6-7d89-40ca-98bc-cd8f9865a40f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.882 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[7e35b9ec-90d2-4a33-bbd2-765399dcc217]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 NetworkManager[55489]: <info>  [1769456368.9255] device (tap02893814-70): carrier: link connected
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.933 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[55b28f9a-44ba-41b2-8007-1c07bee811aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.954 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ad1573-80bd-4054-b914-860fa0df507b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382781, 'reachable_time': 18240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 205898, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:28.980 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1611870e-e8c9-4cef-ae47-58bc1b816a9f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:b949'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382781, 'tstamp': 382781}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 205899, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.007 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f14543-bc9c-4592-9278-c62f8b6fbedf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382781, 'reachable_time': 18240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 205900, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.048 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc01dc6-3b09-4ab4-8854-6ff4c55da730]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.130 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[37da518f-b7be-42d1-b7a8-b33f2a411bdc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.131 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.131 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.131 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.133 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:29 compute-0 NetworkManager[55489]: <info>  [1769456369.1343] manager: (tap02893814-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 26 19:39:29 compute-0 kernel: tap02893814-70: entered promiscuous mode
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.139 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.140 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.141 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:29 compute-0 ovn_controller[95396]: 2026-01-26T19:39:29Z|00062|binding|INFO|Releasing lport 8041d9b1-f920-4d7e-a0aa-621f944e098e from this chassis (sb_readonly=0)
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.175 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.179 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3311c600-0a9e-49a2-aa5e-c06aeb1d8993]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.180 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.180 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.181 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 02893814-74cb-419e-9539-9a1c8c79b4be disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.181 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.182 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[22238c82-0aa7-47b3-bcdd-0fc3f3e9734a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.183 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.183 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[051b435c-1ff1-4ab2-9204-c8a77d8e1dc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.184 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:39:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:39:29.187 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'env', 'PROCESS_TAG=haproxy-02893814-74cb-419e-9539-9a1c8c79b4be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02893814-74cb-419e-9539-9a1c8c79b4be.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.426 183181 DEBUG nova.compute.manager [req-b93f060f-f789-4102-b952-df4146ca9fa5 req-8c06b646-ac48-4fe5-aafe-c5f991fbd758 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.427 183181 DEBUG oslo_concurrency.lockutils [req-b93f060f-f789-4102-b952-df4146ca9fa5 req-8c06b646-ac48-4fe5-aafe-c5f991fbd758 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.428 183181 DEBUG oslo_concurrency.lockutils [req-b93f060f-f789-4102-b952-df4146ca9fa5 req-8c06b646-ac48-4fe5-aafe-c5f991fbd758 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.428 183181 DEBUG oslo_concurrency.lockutils [req-b93f060f-f789-4102-b952-df4146ca9fa5 req-8c06b646-ac48-4fe5-aafe-c5f991fbd758 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:29 compute-0 nova_compute[183177]: 2026-01-26 19:39:29.428 183181 DEBUG nova.compute.manager [req-b93f060f-f789-4102-b952-df4146ca9fa5 req-8c06b646-ac48-4fe5-aafe-c5f991fbd758 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Processing event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:39:29 compute-0 podman[192499]: time="2026-01-26T19:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:39:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:39:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 26 19:39:29 compute-0 podman[205932]: 2026-01-26 19:39:29.87095205 +0000 UTC m=+0.061715360 container create 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:39:29 compute-0 systemd[1]: Started libpod-conmon-62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4.scope.
Jan 26 19:39:29 compute-0 podman[205932]: 2026-01-26 19:39:29.837301327 +0000 UTC m=+0.028064687 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:39:29 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:39:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d890729e16e2c74825ae05ba8c1af4eb0bcbe11527267b90a6a53c5176c604d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:39:29 compute-0 podman[205932]: 2026-01-26 19:39:29.965772379 +0000 UTC m=+0.156535749 container init 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 19:39:29 compute-0 podman[205932]: 2026-01-26 19:39:29.971692534 +0000 UTC m=+0.162455854 container start 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:39:29 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [NOTICE]   (205959) : New worker (205961) forked
Jan 26 19:39:29 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [NOTICE]   (205959) : Loading success.
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.003 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.013 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.018 183181 INFO nova.virt.libvirt.driver [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance spawned successfully.
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.018 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.375 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.536 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.537 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.538 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.539 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.540 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:30 compute-0 nova_compute[183177]: 2026-01-26 19:39:30.540 183181 DEBUG nova.virt.libvirt.driver [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.053 183181 INFO nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Took 11.65 seconds to spawn the instance on the hypervisor.
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.054 183181 DEBUG nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:39:31 compute-0 openstack_network_exporter[195363]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:39:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:39:31 compute-0 openstack_network_exporter[195363]: ERROR   19:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:39:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.501 183181 DEBUG nova.compute.manager [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.502 183181 DEBUG oslo_concurrency.lockutils [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.502 183181 DEBUG oslo_concurrency.lockutils [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.502 183181 DEBUG oslo_concurrency.lockutils [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.502 183181 DEBUG nova.compute.manager [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.503 183181 WARNING nova.compute.manager [req-42ebb598-c04d-4be3-8457-be913efb2d69 req-1941c210-26dc-4adc-bb00-0dbff3d47eaa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received unexpected event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d for instance with vm_state active and task_state None.
Jan 26 19:39:31 compute-0 nova_compute[183177]: 2026-01-26 19:39:31.635 183181 INFO nova.compute.manager [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Took 16.97 seconds to build instance.
Jan 26 19:39:32 compute-0 nova_compute[183177]: 2026-01-26 19:39:32.140 183181 DEBUG oslo_concurrency.lockutils [None req-9a33b70a-9ba9-4526-92c3-8943a22be2cd ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.489s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:33 compute-0 nova_compute[183177]: 2026-01-26 19:39:33.780 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:35 compute-0 nova_compute[183177]: 2026-01-26 19:39:35.379 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:38 compute-0 nova_compute[183177]: 2026-01-26 19:39:38.783 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:40 compute-0 nova_compute[183177]: 2026-01-26 19:39:40.383 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:42 compute-0 ovn_controller[95396]: 2026-01-26T19:39:42Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:71:c5 10.100.0.9
Jan 26 19:39:42 compute-0 ovn_controller[95396]: 2026-01-26T19:39:42Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:71:c5 10.100.0.9
Jan 26 19:39:43 compute-0 nova_compute[183177]: 2026-01-26 19:39:43.787 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:44 compute-0 nova_compute[183177]: 2026-01-26 19:39:44.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:44 compute-0 nova_compute[183177]: 2026-01-26 19:39:44.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:44 compute-0 nova_compute[183177]: 2026-01-26 19:39:44.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:44 compute-0 nova_compute[183177]: 2026-01-26 19:39:44.675 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:44 compute-0 nova_compute[183177]: 2026-01-26 19:39:44.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:39:45 compute-0 nova_compute[183177]: 2026-01-26 19:39:45.386 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:45 compute-0 nova_compute[183177]: 2026-01-26 19:39:45.736 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:45 compute-0 nova_compute[183177]: 2026-01-26 19:39:45.832 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:45 compute-0 nova_compute[183177]: 2026-01-26 19:39:45.834 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:45 compute-0 nova_compute[183177]: 2026-01-26 19:39:45.920 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.160 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.162 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.183 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.183 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.07456970214844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.184 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:39:46 compute-0 nova_compute[183177]: 2026-01-26 19:39:46.184 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:39:47 compute-0 nova_compute[183177]: 2026-01-26 19:39:47.234 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:39:47 compute-0 nova_compute[183177]: 2026-01-26 19:39:47.234 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:39:47 compute-0 nova_compute[183177]: 2026-01-26 19:39:47.235 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:39:46 up  1:04,  0 user,  load average: 0.45, 0.35, 0.48\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:39:47 compute-0 nova_compute[183177]: 2026-01-26 19:39:47.279 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:39:47 compute-0 nova_compute[183177]: 2026-01-26 19:39:47.790 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:39:48 compute-0 nova_compute[183177]: 2026-01-26 19:39:48.315 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:39:48 compute-0 nova_compute[183177]: 2026-01-26 19:39:48.316 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:39:48 compute-0 podman[206001]: 2026-01-26 19:39:48.451056406 +0000 UTC m=+0.186962397 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:39:48 compute-0 nova_compute[183177]: 2026-01-26 19:39:48.789 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.311 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.312 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.312 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.313 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.313 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.313 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.313 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:49 compute-0 nova_compute[183177]: 2026-01-26 19:39:49.314 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:39:50 compute-0 nova_compute[183177]: 2026-01-26 19:39:50.390 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:51 compute-0 podman[206028]: 2026-01-26 19:39:51.337549325 +0000 UTC m=+0.069130166 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:39:51 compute-0 podman[206027]: 2026-01-26 19:39:51.354087229 +0000 UTC m=+0.093151516 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 26 19:39:52 compute-0 nova_compute[183177]: 2026-01-26 19:39:52.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:39:53 compute-0 nova_compute[183177]: 2026-01-26 19:39:53.791 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:55 compute-0 nova_compute[183177]: 2026-01-26 19:39:55.394 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:58 compute-0 nova_compute[183177]: 2026-01-26 19:39:58.794 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:39:59 compute-0 podman[206067]: 2026-01-26 19:39:59.314821881 +0000 UTC m=+0.053726221 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:39:59 compute-0 podman[192499]: time="2026-01-26T19:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:39:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:39:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 19:40:00 compute-0 nova_compute[183177]: 2026-01-26 19:40:00.397 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:01 compute-0 openstack_network_exporter[195363]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:40:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:40:01 compute-0 openstack_network_exporter[195363]: ERROR   19:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:40:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:40:02 compute-0 nova_compute[183177]: 2026-01-26 19:40:02.781 183181 DEBUG nova.compute.manager [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Jan 26 19:40:03 compute-0 nova_compute[183177]: 2026-01-26 19:40:03.316 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:03 compute-0 nova_compute[183177]: 2026-01-26 19:40:03.317 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:03 compute-0 nova_compute[183177]: 2026-01-26 19:40:03.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:03 compute-0 nova_compute[183177]: 2026-01-26 19:40:03.832 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:40:03 compute-0 nova_compute[183177]: 2026-01-26 19:40:03.832 183181 INFO nova.compute.claims [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:40:04 compute-0 nova_compute[183177]: 2026-01-26 19:40:04.343 183181 INFO nova.compute.resource_tracker [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating resource usage from migration dd1eb126-ecbc-4427-9f3e-210191098854
Jan 26 19:40:04 compute-0 nova_compute[183177]: 2026-01-26 19:40:04.398 183181 DEBUG nova.compute.provider_tree [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:40:04 compute-0 nova_compute[183177]: 2026-01-26 19:40:04.913 183181 DEBUG nova.scheduler.client.report [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.399 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.448 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.448 183181 INFO nova.compute.manager [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Migrating
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.449 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.449 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.978 183181 INFO nova.compute.rpcapi [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Jan 26 19:40:05 compute-0 nova_compute[183177]: 2026-01-26 19:40:05.980 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:06 compute-0 nova_compute[183177]: 2026-01-26 19:40:06.884 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:06 compute-0 nova_compute[183177]: 2026-01-26 19:40:06.885 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:06 compute-0 nova_compute[183177]: 2026-01-26 19:40:06.886 183181 DEBUG nova.network.neutron [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:40:07 compute-0 nova_compute[183177]: 2026-01-26 19:40:07.393 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:08 compute-0 nova_compute[183177]: 2026-01-26 19:40:08.823 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:09 compute-0 nova_compute[183177]: 2026-01-26 19:40:09.560 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:10 compute-0 nova_compute[183177]: 2026-01-26 19:40:10.194 183181 DEBUG nova.network.neutron [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:40:10 compute-0 nova_compute[183177]: 2026-01-26 19:40:10.401 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:10 compute-0 nova_compute[183177]: 2026-01-26 19:40:10.702 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:12 compute-0 nova_compute[183177]: 2026-01-26 19:40:12.262 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Jan 26 19:40:12 compute-0 nova_compute[183177]: 2026-01-26 19:40:12.267 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Jan 26 19:40:13 compute-0 nova_compute[183177]: 2026-01-26 19:40:13.826 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 kernel: tap060e9277-c0 (unregistering): left promiscuous mode
Jan 26 19:40:14 compute-0 NetworkManager[55489]: <info>  [1769456414.4968] device (tap060e9277-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.512 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 ovn_controller[95396]: 2026-01-26T19:40:14Z|00063|binding|INFO|Releasing lport 060e9277-c0a7-426c-af54-216da387f47d from this chassis (sb_readonly=0)
Jan 26 19:40:14 compute-0 ovn_controller[95396]: 2026-01-26T19:40:14Z|00064|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d down in Southbound
Jan 26 19:40:14 compute-0 ovn_controller[95396]: 2026-01-26T19:40:14Z|00065|binding|INFO|Removing iface tap060e9277-c0 ovn-installed in OVS
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.515 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.541 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:71:c5 10.100.0.9'], port_security=['fa:16:3e:09:71:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '04802a55-668d-42ba-bc20-72c2e3f29298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=060e9277-c0a7-426c-af54-216da387f47d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.542 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 060e9277-c0a7-426c-af54-216da387f47d in datapath 02893814-74cb-419e-9539-9a1c8c79b4be unbound from our chassis
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.544 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02893814-74cb-419e-9539-9a1c8c79b4be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.546 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a293f8-de8f-4e4e-a8e8-7be3c1e078b4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.547 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be namespace which is not needed anymore
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.553 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 19:40:14 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 15.362s CPU time.
Jan 26 19:40:14 compute-0 systemd-machined[154465]: Machine qemu-3-instance-00000004 terminated.
Jan 26 19:40:14 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [NOTICE]   (205959) : haproxy version is 3.0.5-8e879a5
Jan 26 19:40:14 compute-0 podman[206116]: 2026-01-26 19:40:14.726760536 +0000 UTC m=+0.047853817 container kill 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:40:14 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [NOTICE]   (205959) : path to executable is /usr/sbin/haproxy
Jan 26 19:40:14 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [WARNING]  (205959) : Exiting Master process...
Jan 26 19:40:14 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [ALERT]    (205959) : Current worker (205961) exited with code 143 (Terminated)
Jan 26 19:40:14 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[205953]: [WARNING]  (205959) : All workers exited. Exiting... (0)
Jan 26 19:40:14 compute-0 systemd[1]: libpod-62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4.scope: Deactivated successfully.
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.788 183181 DEBUG nova.compute.manager [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.788 183181 DEBUG oslo_concurrency.lockutils [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.789 183181 DEBUG oslo_concurrency.lockutils [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.789 183181 DEBUG oslo_concurrency.lockutils [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.789 183181 DEBUG nova.compute.manager [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.789 183181 WARNING nova.compute.manager [req-8c7440ac-f4f9-44ac-899e-89450484f215 req-23858658-ab76-4b13-a667-22bf349862f1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received unexpected event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d for instance with vm_state active and task_state resize_migrating.
Jan 26 19:40:14 compute-0 podman[206136]: 2026-01-26 19:40:14.804350472 +0000 UTC m=+0.044749396 container died 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 19:40:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4-userdata-shm.mount: Deactivated successfully.
Jan 26 19:40:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-9d890729e16e2c74825ae05ba8c1af4eb0bcbe11527267b90a6a53c5176c604d-merged.mount: Deactivated successfully.
Jan 26 19:40:14 compute-0 podman[206136]: 2026-01-26 19:40:14.850260777 +0000 UTC m=+0.090659611 container cleanup 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:40:14 compute-0 systemd[1]: libpod-conmon-62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4.scope: Deactivated successfully.
Jan 26 19:40:14 compute-0 podman[206140]: 2026-01-26 19:40:14.872633864 +0000 UTC m=+0.099615505 container remove 62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.878 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9d470c-ae3b-4f1a-8861-a3164f17571d]: (4, ("Mon Jan 26 07:40:14 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be (62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4)\n62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4\nMon Jan 26 07:40:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be (62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4)\n62ac77315918735eb40853437a47d344a423d3b5173064eb7091c62524845fa4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.879 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c74a71e9-4ada-4d20-b9ec-731b12e8e499]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.880 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.880 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3df5806b-fe99-4dbb-a124-a161b2e9e0a1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.881 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.884 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 kernel: tap02893814-70: left promiscuous mode
Jan 26 19:40:14 compute-0 nova_compute[183177]: 2026-01-26 19:40:14.903 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.906 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b2efde-bb0b-4675-a238-176730c728b5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.919 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[51bf961f-03ff-4d7e-84b8-ca39551a8961]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.920 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e7abd6ad-c413-4127-88aa-1f05cafe562e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.938 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4e57e36b-d9e5-4ce9-b470-fd4b86226422]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382770, 'reachable_time': 33911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206183, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.943 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:40:14 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:14.944 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[58bb7f70-3069-4f4d-8280-de5547e03148]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d02893814\x2d74cb\x2d419e\x2d9539\x2d9a1c8c79b4be.mount: Deactivated successfully.
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.287 183181 INFO nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance shutdown successfully after 3 seconds.
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.295 183181 INFO nova.virt.libvirt.driver [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance destroyed successfully.
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.297 183181 DEBUG nova.virt.libvirt.vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:39:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:40:02Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.297 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.298 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.299 183181 DEBUG os_vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.302 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.302 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap060e9277-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.304 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.306 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.307 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.307 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4a780890-c1c2-4d36-8b76-7b2996e38352) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.308 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.309 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.311 183181 INFO os_vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0')
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.314 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.399 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.401 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.467 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.469 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.506 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.513 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk.config /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.547 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk.config /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.549 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk.info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.592 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "cp -r /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_resize/disk.info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.594 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:15 compute-0 nova_compute[183177]: 2026-01-26 19:40:15.595 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.359 183181 DEBUG nova.network.neutron [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Port 060e9277-c0a7-426c-af54-216da387f47d binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3231
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.892 183181 DEBUG nova.compute.manager [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.893 183181 DEBUG oslo_concurrency.lockutils [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.893 183181 DEBUG oslo_concurrency.lockutils [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.894 183181 DEBUG oslo_concurrency.lockutils [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.894 183181 DEBUG nova.compute.manager [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:40:16 compute-0 nova_compute[183177]: 2026-01-26 19:40:16.894 183181 WARNING nova.compute.manager [req-cf89e118-02b5-42c2-a19b-371683251cf4 req-9b6991ad-2e2c-4cf0-8a1b-3d12c6b636b2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received unexpected event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d for instance with vm_state active and task_state resize_migrating.
Jan 26 19:40:17 compute-0 nova_compute[183177]: 2026-01-26 19:40:17.421 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:17 compute-0 nova_compute[183177]: 2026-01-26 19:40:17.422 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:17 compute-0 nova_compute[183177]: 2026-01-26 19:40:17.422 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:18 compute-0 nova_compute[183177]: 2026-01-26 19:40:18.431 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:18 compute-0 nova_compute[183177]: 2026-01-26 19:40:18.829 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:19 compute-0 nova_compute[183177]: 2026-01-26 19:40:19.377 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:19 compute-0 nova_compute[183177]: 2026-01-26 19:40:19.378 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:19 compute-0 nova_compute[183177]: 2026-01-26 19:40:19.378 183181 DEBUG nova.network.neutron [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:40:19 compute-0 podman[206193]: 2026-01-26 19:40:19.395444064 +0000 UTC m=+0.135231139 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:40:19 compute-0 nova_compute[183177]: 2026-01-26 19:40:19.885 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:20 compute-0 nova_compute[183177]: 2026-01-26 19:40:20.308 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:21 compute-0 nova_compute[183177]: 2026-01-26 19:40:21.430 183181 WARNING neutronclient.v2_0.client [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:21 compute-0 nova_compute[183177]: 2026-01-26 19:40:21.621 183181 DEBUG nova.network.neutron [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:40:22 compute-0 nova_compute[183177]: 2026-01-26 19:40:22.129 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:22 compute-0 podman[206221]: 2026-01-26 19:40:22.364610404 +0000 UTC m=+0.088994397 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 19:40:22 compute-0 podman[206220]: 2026-01-26 19:40:22.380963062 +0000 UTC m=+0.106442264 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, config_id=openstack_network_exporter)
Jan 26 19:40:22 compute-0 nova_compute[183177]: 2026-01-26 19:40:22.991 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Jan 26 19:40:22 compute-0 nova_compute[183177]: 2026-01-26 19:40:22.993 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Jan 26 19:40:22 compute-0 nova_compute[183177]: 2026-01-26 19:40:22.994 183181 INFO nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Creating image(s)
Jan 26 19:40:22 compute-0 nova_compute[183177]: 2026-01-26 19:40:22.996 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.081 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.081 183181 DEBUG nova.virt.disk.api [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Checking if we can resize image /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.082 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.159 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.160 183181 DEBUG nova.virt.disk.api [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Cannot resize image /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.676 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.677 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Ensure instance console log exists: /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.677 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.678 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.678 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.680 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start _get_guest_xml network_info=[{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.685 183181 WARNING nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.686 183181 DEBUG nova.virt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-174497542', uuid='04802a55-668d-42ba-bc20-72c2e3f29298'), owner=OwnerMeta(userid='ee1e0029a6ac4b56b09c165dc3cd4dda', username='tempest-TestExecuteActionsViaActuator-1232791976-project-admin', projectid='577ae27ca8cf44549308a35c420ae86d', projectname='tempest-TestExecuteActionsViaActuator-1232791976'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='8f430d2d-ba38-4a38-a47d-7763d17a5dc3', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456423.686636) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.693 183181 DEBUG nova.virt.libvirt.host [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.694 183181 DEBUG nova.virt.libvirt.host [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.697 183181 DEBUG nova.virt.libvirt.host [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.698 183181 DEBUG nova.virt.libvirt.host [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.699 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.699 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='8f430d2d-ba38-4a38-a47d-7763d17a5dc3',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.700 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.700 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.700 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.700 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.701 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.701 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.701 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.702 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.702 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.702 183181 DEBUG nova.virt.hardware [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.706 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.785 183181 DEBUG oslo_concurrency.processutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.790 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.791 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.793 183181 DEBUG oslo_concurrency.lockutils [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.795 183181 DEBUG nova.virt.libvirt.vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:39:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:40:16Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.795 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.797 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.804 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <uuid>04802a55-668d-42ba-bc20-72c2e3f29298</uuid>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <name>instance-00000004</name>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <memory>196608</memory>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-174497542</nova:name>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:40:23</nova:creationTime>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:flavor name="m1.micro" id="8f430d2d-ba38-4a38-a47d-7763d17a5dc3">
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:memory>192</nova:memory>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_input_bus">usb</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_machine_type">q35</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_video_model">virtio</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:property name="hw_vif_model">virtio</nova:property>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         <nova:port uuid="060e9277-c0a7-426c-af54-216da387f47d">
Jan 26 19:40:23 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <system>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="serial">04802a55-668d-42ba-bc20-72c2e3f29298</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="uuid">04802a55-668d-42ba-bc20-72c2e3f29298</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </system>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <os>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </os>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <features>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </features>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk.config"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:09:71:c5"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <target dev="tap060e9277-c0"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/console.log" append="off"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <video>
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </video>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:40:23 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:40:23 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:40:23 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:40:23 compute-0 nova_compute[183177]: </domain>
Jan 26 19:40:23 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.805 183181 DEBUG nova.virt.libvirt.vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:39:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:40:16Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.806 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:09:71:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.807 183181 DEBUG nova.network.os_vif_util [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.807 183181 DEBUG os_vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.809 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.810 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.810 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.811 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.812 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd875766e-996a-59f6-8c58-a663b46c2676', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.814 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.816 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.824 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.824 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap060e9277-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.825 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap060e9277-c0, col_values=(('qos', UUID('f819ef1f-2988-4fb3-a751-da7727c0fbeb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.826 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap060e9277-c0, col_values=(('external_ids', {'iface-id': '060e9277-c0a7-426c-af54-216da387f47d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:71:c5', 'vm-uuid': '04802a55-668d-42ba-bc20-72c2e3f29298'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.827 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 NetworkManager[55489]: <info>  [1769456423.8304] manager: (tap060e9277-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.831 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.835 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:23 compute-0 nova_compute[183177]: 2026-01-26 19:40:23.837 183181 INFO os_vif [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0')
Jan 26 19:40:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:24.035 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:24.035 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:24.035 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.520 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.521 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.521 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] No VIF found with MAC fa:16:3e:09:71:c5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.522 183181 INFO nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Using config drive
Jan 26 19:40:25 compute-0 kernel: tap060e9277-c0: entered promiscuous mode
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.6035] manager: (tap060e9277-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 26 19:40:25 compute-0 ovn_controller[95396]: 2026-01-26T19:40:25Z|00066|binding|INFO|Claiming lport 060e9277-c0a7-426c-af54-216da387f47d for this chassis.
Jan 26 19:40:25 compute-0 ovn_controller[95396]: 2026-01-26T19:40:25Z|00067|binding|INFO|060e9277-c0a7-426c-af54-216da387f47d: Claiming fa:16:3e:09:71:c5 10.100.0.9
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.607 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.614 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:71:c5 10.100.0.9'], port_security=['fa:16:3e:09:71:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '04802a55-668d-42ba-bc20-72c2e3f29298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=060e9277-c0a7-426c-af54-216da387f47d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.615 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 060e9277-c0a7-426c-af54-216da387f47d in datapath 02893814-74cb-419e-9539-9a1c8c79b4be bound to our chassis
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.617 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.632 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a9b0d8-ae15-4beb-8f65-2624c832a123]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.633 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02893814-71 in ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.635 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02893814-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.635 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d5671c-ebb1-477f-9dfd-77f1003295af]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_controller[95396]: 2026-01-26T19:40:25Z|00068|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d up in Southbound
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.636 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ad2ac3-6bfb-486e-9145-0ead4d57ec77]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_controller[95396]: 2026-01-26T19:40:25Z|00069|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d ovn-installed in OVS
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.637 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.638 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.640 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 systemd-udevd[206290]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.653 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[ca178fa2-223f-4c01-ab1b-55ef1f87a25c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.6649] device (tap060e9277-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.6657] device (tap060e9277-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:40:25 compute-0 systemd-machined[154465]: New machine qemu-4-instance-00000004.
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.670 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6059442a-03ab-4f0e-afc2-6e8750302fa3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.707 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e6751f-69b7-45e6-8d75-d1baf948ba77]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.715 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[aabdac35-2b17-4bcb-92f1-dd44b1cd562e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 systemd-udevd[206294]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.7175] manager: (tap02893814-70): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.757 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[1e78883d-2a58-4bd1-bba2-172585c846d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.761 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[2008c7b0-2332-4119-860f-362db0b1a457]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.7860] device (tap02893814-70): carrier: link connected
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.793 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e911c4d4-80df-482c-a837-25355874cde3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.813 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[64d373f0-fcd6-4c80-ac19-700c33cc9339]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206322, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.831 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e132c06d-1422-42a5-8c2b-80d00047a8cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7c:b949'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388467, 'tstamp': 388467}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206325, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.855 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f9d353-be70-4f87-9886-fe37e11d413d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206329, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.891 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8e072f-2457-4bbc-a087-6da6539d95e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.956 183181 DEBUG nova.compute.manager [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.959 183181 INFO nova.virt.libvirt.driver [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance running successfully.
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.958 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a99fa3b2-84e6-4582-b485-e1ad7c318d5b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 virtqemud[182929]: argument unsupported: QEMU guest agent is not configured
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.960 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:25 compute-0 kernel: tap02893814-70: entered promiscuous mode
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.960 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:40:25 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.961 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:25 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.961 183181 DEBUG nova.virt.libvirt.guest [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.961 183181 DEBUG nova.virt.libvirt.driver [None req-7d654bf6-0392-4a3e-b935-e2fc4e28499f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.963 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 NetworkManager[55489]: <info>  [1769456425.9666] manager: (tap02893814-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.966 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.968 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:25 compute-0 ovn_controller[95396]: 2026-01-26T19:40:25Z|00070|binding|INFO|Releasing lport 8041d9b1-f920-4d7e-a0aa-621f944e098e from this chassis (sb_readonly=0)
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.970 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.983 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 nova_compute[183177]: 2026-01-26 19:40:25.984 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.985 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[48bc08e0-d904-40c5-8999-3b4be85f38dc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.986 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.986 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.986 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 02893814-74cb-419e-9539-9a1c8c79b4be disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.987 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.987 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b34e7431-0cf3-4b93-b4ea-55af945a9a29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.987 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.988 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[221bdbff-3cfb-4240-b633-6f1b2246a3fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.988 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:40:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:25.989 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'env', 'PROCESS_TAG=haproxy-02893814-74cb-419e-9539-9a1c8c79b4be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02893814-74cb-419e-9539-9a1c8c79b4be.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:40:26 compute-0 podman[206364]: 2026-01-26 19:40:26.466495789 +0000 UTC m=+0.059842741 container create b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 19:40:26 compute-0 systemd[1]: Started libpod-conmon-b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6.scope.
Jan 26 19:40:26 compute-0 podman[206364]: 2026-01-26 19:40:26.432177189 +0000 UTC m=+0.025524101 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.541 183181 DEBUG nova.compute.manager [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.541 183181 DEBUG oslo_concurrency.lockutils [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.542 183181 DEBUG oslo_concurrency.lockutils [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.542 183181 DEBUG oslo_concurrency.lockutils [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.542 183181 DEBUG nova.compute.manager [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.543 183181 WARNING nova.compute.manager [req-a4e09a65-f159-4e72-b201-1e0e3a8f7faf req-d5ef7897-e57d-4d23-ad07-ba85ca806f8e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received unexpected event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d for instance with vm_state active and task_state resize_finish.
Jan 26 19:40:26 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:40:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22a2e47e432207fb5cdd00439037e488a42d2da2d92634960f74ae3f71fa5fef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:40:26 compute-0 podman[206364]: 2026-01-26 19:40:26.590678138 +0000 UTC m=+0.184025090 container init b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 26 19:40:26 compute-0 podman[206364]: 2026-01-26 19:40:26.601877992 +0000 UTC m=+0.195224944 container start b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 19:40:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:26.614 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:40:26 compute-0 nova_compute[183177]: 2026-01-26 19:40:26.614 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:26 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [NOTICE]   (206384) : New worker (206386) forked
Jan 26 19:40:26 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [NOTICE]   (206384) : Loading success.
Jan 26 19:40:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:26.694 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.659 183181 DEBUG nova.compute.manager [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.660 183181 DEBUG oslo_concurrency.lockutils [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.660 183181 DEBUG oslo_concurrency.lockutils [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.660 183181 DEBUG oslo_concurrency.lockutils [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.661 183181 DEBUG nova.compute.manager [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.661 183181 WARNING nova.compute.manager [req-ce577b59-d18e-48d8-80b7-eb5260b169d8 req-36f40cb3-f661-4e76-a549-b0ad894379c0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received unexpected event network-vif-plugged-060e9277-c0a7-426c-af54-216da387f47d for instance with vm_state resized and task_state None.
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.830 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:28 compute-0 nova_compute[183177]: 2026-01-26 19:40:28.837 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:29 compute-0 podman[192499]: time="2026-01-26T19:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:40:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:40:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Jan 26 19:40:30 compute-0 podman[206396]: 2026-01-26 19:40:30.354378939 +0000 UTC m=+0.103784075 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:40:31 compute-0 openstack_network_exporter[195363]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:40:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:40:31 compute-0 openstack_network_exporter[195363]: ERROR   19:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:40:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:40:33 compute-0 nova_compute[183177]: 2026-01-26 19:40:33.835 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:33 compute-0 nova_compute[183177]: 2026-01-26 19:40:33.840 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:34.695 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.178 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.179 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.180 183181 DEBUG nova.compute.manager [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5287
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.703 183181 DEBUG nova.objects.instance [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'info_cache' on Instance uuid 04802a55-668d-42ba-bc20-72c2e3f29298 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.840 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:38 compute-0 nova_compute[183177]: 2026-01-26 19:40:38.841 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:39 compute-0 ovn_controller[95396]: 2026-01-26T19:40:39Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:71:c5 10.100.0.9
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.228 183181 WARNING neutronclient.v2_0.client [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.423 183181 WARNING neutronclient.v2_0.client [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.424 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.425 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.425 183181 DEBUG nova.network.neutron [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:40:39 compute-0 nova_compute[183177]: 2026-01-26 19:40:39.932 183181 WARNING neutronclient.v2_0.client [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:40 compute-0 nova_compute[183177]: 2026-01-26 19:40:40.360 183181 WARNING neutronclient.v2_0.client [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:40 compute-0 nova_compute[183177]: 2026-01-26 19:40:40.681 183181 DEBUG nova.network.neutron [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [{"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.198 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-04802a55-668d-42ba-bc20-72c2e3f29298" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.199 183181 DEBUG nova.objects.instance [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 04802a55-668d-42ba-bc20-72c2e3f29298 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.708 183181 DEBUG nova.objects.base [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Object Instance<04802a55-668d-42ba-bc20-72c2e3f29298> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.711 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.713 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.822 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:41 compute-0 nova_compute[183177]: 2026-01-26 19:40:41.823 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:42 compute-0 nova_compute[183177]: 2026-01-26 19:40:42.329 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:40:42 compute-0 nova_compute[183177]: 2026-01-26 19:40:42.335 183181 DEBUG nova.compute.provider_tree [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:40:42 compute-0 nova_compute[183177]: 2026-01-26 19:40:42.841 183181 DEBUG nova.scheduler.client.report [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:40:42 compute-0 nova_compute[183177]: 2026-01-26 19:40:42.889 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.842 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.844 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.875 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.162s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.879 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.990s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.889 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:40:43 compute-0 nova_compute[183177]: 2026-01-26 19:40:43.890 183181 INFO nova.compute.claims [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:40:44 compute-0 nova_compute[183177]: 2026-01-26 19:40:44.485 183181 INFO nova.scheduler.client.report [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration dd1eb126-ecbc-4427-9f3e-210191098854
Jan 26 19:40:44 compute-0 nova_compute[183177]: 2026-01-26 19:40:44.966 183181 DEBUG nova.compute.provider_tree [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:40:44 compute-0 nova_compute[183177]: 2026-01-26 19:40:44.996 183181 DEBUG oslo_concurrency.lockutils [None req-853ab01c-05c0-410b-830e-2a86f8c780f6 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.816s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.476 183181 DEBUG nova.scheduler.client.report [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.987 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.988 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.992 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.325s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.993 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:45 compute-0 nova_compute[183177]: 2026-01-26 19:40:45.993 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:40:46 compute-0 nova_compute[183177]: 2026-01-26 19:40:46.504 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:40:46 compute-0 nova_compute[183177]: 2026-01-26 19:40:46.505 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:40:46 compute-0 nova_compute[183177]: 2026-01-26 19:40:46.506 183181 WARNING neutronclient.v2_0.client [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:46 compute-0 nova_compute[183177]: 2026-01-26 19:40:46.507 183181 WARNING neutronclient.v2_0.client [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.017 183181 INFO nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.062 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.127 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Successfully created port: 324164fa-164b-418a-ba25-b2508e836f80 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.159 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.160 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.231 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.419 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.420 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.440 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.441 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5646MB free_disk=73.07579040527344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.441 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.442 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:47 compute-0 nova_compute[183177]: 2026-01-26 19:40:47.528 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.386 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Successfully updated port: 324164fa-164b-418a-ba25-b2508e836f80 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.461 183181 DEBUG nova.compute.manager [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-changed-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.462 183181 DEBUG nova.compute.manager [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Refreshing instance network info cache due to event network-changed-324164fa-164b-418a-ba25-b2508e836f80. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.462 183181 DEBUG oslo_concurrency.lockutils [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.463 183181 DEBUG oslo_concurrency.lockutils [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.463 183181 DEBUG nova.network.neutron [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Refreshing network info cache for port 324164fa-164b-418a-ba25-b2508e836f80 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.486 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.487 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance e1a70852-daf7-45b0-849d-957892f3d109 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.487 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.487 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:40:47 up  1:05,  0 user,  load average: 0.31, 0.32, 0.46\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_networking': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.547 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.549 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.550 183181 INFO nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Creating image(s)
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.551 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.551 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.552 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.553 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.559 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.566 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.577 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.655 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.656 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.657 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.657 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.661 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.661 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.728 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.729 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.769 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.770 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.770 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.825 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.826 183181 DEBUG nova.virt.disk.api [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Checking if we can resize image /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.827 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.845 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.847 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.893 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.919 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.920 183181 DEBUG nova.virt.disk.api [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Cannot resize image /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.921 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.921 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Ensure instance console log exists: /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.921 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.922 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.922 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:48 compute-0 nova_compute[183177]: 2026-01-26 19:40:48.974 183181 WARNING neutronclient.v2_0.client [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:49 compute-0 nova_compute[183177]: 2026-01-26 19:40:49.087 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:40:49 compute-0 nova_compute[183177]: 2026-01-26 19:40:49.377 183181 DEBUG nova.network.neutron [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:40:49 compute-0 nova_compute[183177]: 2026-01-26 19:40:49.558 183181 DEBUG nova.network.neutron [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:40:49 compute-0 nova_compute[183177]: 2026-01-26 19:40:49.599 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:40:49 compute-0 nova_compute[183177]: 2026-01-26 19:40:49.599 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:50 compute-0 nova_compute[183177]: 2026-01-26 19:40:50.066 183181 DEBUG oslo_concurrency.lockutils [req-79d994e9-3621-407f-a0c5-8edc128bd26f req-6b57248b-1dba-4188-a848-19448ca868ed 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:50 compute-0 nova_compute[183177]: 2026-01-26 19:40:50.067 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquired lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:40:50 compute-0 nova_compute[183177]: 2026-01-26 19:40:50.067 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:40:50 compute-0 podman[206453]: 2026-01-26 19:40:50.393906985 +0000 UTC m=+0.135156309 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.361 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.597 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.597 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.598 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.599 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:51 compute-0 nova_compute[183177]: 2026-01-26 19:40:51.599 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.185 183181 WARNING neutronclient.v2_0.client [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.400 183181 DEBUG nova.network.neutron [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updating instance_info_cache with network_info: [{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.908 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Releasing lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.909 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance network_info: |[{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.913 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Start _get_guest_xml network_info=[{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.920 183181 WARNING nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.923 183181 DEBUG nova.virt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-51219326', uuid='e1a70852-daf7-45b0-849d-957892f3d109'), owner=OwnerMeta(userid='ee1e0029a6ac4b56b09c165dc3cd4dda', username='tempest-TestExecuteActionsViaActuator-1232791976-project-admin', projectid='577ae27ca8cf44549308a35c420ae86d', projectname='tempest-TestExecuteActionsViaActuator-1232791976'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456452.9228485) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.929 183181 DEBUG nova.virt.libvirt.host [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.930 183181 DEBUG nova.virt.libvirt.host [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.934 183181 DEBUG nova.virt.libvirt.host [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.935 183181 DEBUG nova.virt.libvirt.host [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.937 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.938 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.939 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.939 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.940 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.940 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.941 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.941 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.942 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.942 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.943 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.943 183181 DEBUG nova.virt.hardware [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.951 183181 DEBUG nova.virt.libvirt.vif [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:40:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-51219326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-51219326',id=6,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-l1d7bs6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsVia
Actuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:40:47Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e1a70852-daf7-45b0-849d-957892f3d109,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.952 183181 DEBUG nova.network.os_vif_util [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.954 183181 DEBUG nova.network.os_vif_util [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:40:52 compute-0 nova_compute[183177]: 2026-01-26 19:40:52.956 183181 DEBUG nova.objects.instance [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lazy-loading 'pci_devices' on Instance uuid e1a70852-daf7-45b0-849d-957892f3d109 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:40:53 compute-0 podman[206479]: 2026-01-26 19:40:53.323617299 +0000 UTC m=+0.063544898 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Jan 26 19:40:53 compute-0 podman[206480]: 2026-01-26 19:40:53.34765627 +0000 UTC m=+0.077415633 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.470 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <uuid>e1a70852-daf7-45b0-849d-957892f3d109</uuid>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <name>instance-00000006</name>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-51219326</nova:name>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:40:52</nova:creationTime>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:40:53 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:40:53 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         <nova:port uuid="324164fa-164b-418a-ba25-b2508e836f80">
Jan 26 19:40:53 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <system>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="serial">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="uuid">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </system>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <os>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </os>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <features>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </features>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:b4:a5:5d"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <target dev="tap324164fa-16"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <video>
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </video>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:40:53 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:40:53 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:40:53 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:40:53 compute-0 nova_compute[183177]: </domain>
Jan 26 19:40:53 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.471 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Preparing to wait for external event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.471 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.471 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.472 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.472 183181 DEBUG nova.virt.libvirt.vif [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:40:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-51219326',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-51219326',id=6,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-l1d7bs6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:40:47Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e1a70852-daf7-45b0-849d-957892f3d109,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.473 183181 DEBUG nova.network.os_vif_util [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.473 183181 DEBUG nova.network.os_vif_util [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.473 183181 DEBUG os_vif [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.474 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.474 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.475 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.475 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.475 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'abe058e5-3897-5522-a278-2e2f5aad952d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.504 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.506 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.506 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.511 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.511 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap324164fa-16, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.512 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap324164fa-16, col_values=(('qos', UUID('4668179b-9f24-4134-86de-fb08dea07ce0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.512 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap324164fa-16, col_values=(('external_ids', {'iface-id': '324164fa-164b-418a-ba25-b2508e836f80', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:a5:5d', 'vm-uuid': 'e1a70852-daf7-45b0-849d-957892f3d109'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.514 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 NetworkManager[55489]: <info>  [1769456453.5159] manager: (tap324164fa-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.516 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.525 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.526 183181 INFO os_vif [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16')
Jan 26 19:40:53 compute-0 nova_compute[183177]: 2026-01-26 19:40:53.850 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:55 compute-0 nova_compute[183177]: 2026-01-26 19:40:55.079 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:40:55 compute-0 nova_compute[183177]: 2026-01-26 19:40:55.079 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:40:55 compute-0 nova_compute[183177]: 2026-01-26 19:40:55.079 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No VIF found with MAC fa:16:3e:b4:a5:5d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:40:55 compute-0 nova_compute[183177]: 2026-01-26 19:40:55.080 183181 INFO nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Using config drive
Jan 26 19:40:55 compute-0 nova_compute[183177]: 2026-01-26 19:40:55.592 183181 WARNING neutronclient.v2_0.client [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.439 183181 INFO nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Creating config drive at /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.444 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp48hxln_x execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.586 183181 DEBUG oslo_concurrency.processutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp48hxln_x" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:40:56 compute-0 kernel: tap324164fa-16: entered promiscuous mode
Jan 26 19:40:56 compute-0 NetworkManager[55489]: <info>  [1769456456.6632] manager: (tap324164fa-16): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.665 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:56 compute-0 ovn_controller[95396]: 2026-01-26T19:40:56Z|00071|binding|INFO|Claiming lport 324164fa-164b-418a-ba25-b2508e836f80 for this chassis.
Jan 26 19:40:56 compute-0 ovn_controller[95396]: 2026-01-26T19:40:56Z|00072|binding|INFO|324164fa-164b-418a-ba25-b2508e836f80: Claiming fa:16:3e:b4:a5:5d 10.100.0.8
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.673 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a5:5d 10.100.0.8'], port_security=['fa:16:3e:b4:a5:5d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e1a70852-daf7-45b0-849d-957892f3d109', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=324164fa-164b-418a-ba25-b2508e836f80) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.675 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 324164fa-164b-418a-ba25-b2508e836f80 in datapath 02893814-74cb-419e-9539-9a1c8c79b4be bound to our chassis
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.677 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:40:56 compute-0 ovn_controller[95396]: 2026-01-26T19:40:56Z|00073|binding|INFO|Setting lport 324164fa-164b-418a-ba25-b2508e836f80 ovn-installed in OVS
Jan 26 19:40:56 compute-0 ovn_controller[95396]: 2026-01-26T19:40:56Z|00074|binding|INFO|Setting lport 324164fa-164b-418a-ba25-b2508e836f80 up in Southbound
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.689 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.699 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e8fca6dd-53b8-4b25-bf85-9f2b2d3b630e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 systemd-udevd[206539]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:40:56 compute-0 systemd-machined[154465]: New machine qemu-5-instance-00000006.
Jan 26 19:40:56 compute-0 NetworkManager[55489]: <info>  [1769456456.7272] device (tap324164fa-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:40:56 compute-0 NetworkManager[55489]: <info>  [1769456456.7280] device (tap324164fa-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.734 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c243f13c-31dc-4629-8585-242f59431a9e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.737 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[602b0fd0-3fef-47a5-b39c-a4e82adee122]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.771 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[067d067a-158d-4f66-afb2-d38a8e77de31]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.801 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[69f2fdd8-72c6-4879-8ead-e5a9c259a645]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206546, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.823 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8a36c871-9f2c-414e-b173-8141d4295463]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388481, 'tstamp': 388481}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206552, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388484, 'tstamp': 388484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206552, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.825 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.826 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.828 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.829 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.829 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.829 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.829 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:40:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:40:56.831 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[304a6ccc-3209-401e-993d-db9103a724a2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.892 183181 DEBUG nova.compute.manager [req-44eb0a4c-7907-4eb8-a792-b60035f7b406 req-22853366-80ff-4d8f-9f53-68859a950166 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.892 183181 DEBUG oslo_concurrency.lockutils [req-44eb0a4c-7907-4eb8-a792-b60035f7b406 req-22853366-80ff-4d8f-9f53-68859a950166 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.892 183181 DEBUG oslo_concurrency.lockutils [req-44eb0a4c-7907-4eb8-a792-b60035f7b406 req-22853366-80ff-4d8f-9f53-68859a950166 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.893 183181 DEBUG oslo_concurrency.lockutils [req-44eb0a4c-7907-4eb8-a792-b60035f7b406 req-22853366-80ff-4d8f-9f53-68859a950166 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:56 compute-0 nova_compute[183177]: 2026-01-26 19:40:56.893 183181 DEBUG nova.compute.manager [req-44eb0a4c-7907-4eb8-a792-b60035f7b406 req-22853366-80ff-4d8f-9f53-68859a950166 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Processing event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.017 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.020 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.024 183181 INFO nova.virt.libvirt.driver [-] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance spawned successfully.
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.025 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.541 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.543 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.544 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.545 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.546 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:57 compute-0 nova_compute[183177]: 2026-01-26 19:40:57.547 183181 DEBUG nova.virt.libvirt.driver [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.064 183181 INFO nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Took 9.52 seconds to spawn the instance on the hypervisor.
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.065 183181 DEBUG nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.516 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.594 183181 INFO nova.compute.manager [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Took 15.76 seconds to build instance.
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.852 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.965 183181 DEBUG nova.compute.manager [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.965 183181 DEBUG oslo_concurrency.lockutils [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.966 183181 DEBUG oslo_concurrency.lockutils [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.966 183181 DEBUG oslo_concurrency.lockutils [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.966 183181 DEBUG nova.compute.manager [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:40:58 compute-0 nova_compute[183177]: 2026-01-26 19:40:58.966 183181 WARNING nova.compute.manager [req-d9b19a37-2079-44a0-a3b0-5bd2fc9f4fa6 req-e0a22e5c-f1ec-4533-9cb0-e2f183641300 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received unexpected event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with vm_state active and task_state None.
Jan 26 19:40:59 compute-0 nova_compute[183177]: 2026-01-26 19:40:59.100 183181 DEBUG oslo_concurrency.lockutils [None req-14aa86f5-f47f-44f6-a86c-1c5bff45a9e3 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.277s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:40:59 compute-0 podman[192499]: time="2026-01-26T19:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:40:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:40:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 19:41:01 compute-0 podman[206561]: 2026-01-26 19:41:01.353423114 +0000 UTC m=+0.096376150 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:41:01 compute-0 openstack_network_exporter[195363]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:41:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:41:01 compute-0 openstack_network_exporter[195363]: ERROR   19:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:41:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:41:03 compute-0 nova_compute[183177]: 2026-01-26 19:41:03.520 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:03 compute-0 nova_compute[183177]: 2026-01-26 19:41:03.855 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:08 compute-0 nova_compute[183177]: 2026-01-26 19:41:08.524 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:08 compute-0 nova_compute[183177]: 2026-01-26 19:41:08.857 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:10 compute-0 ovn_controller[95396]: 2026-01-26T19:41:10Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b4:a5:5d 10.100.0.8
Jan 26 19:41:10 compute-0 ovn_controller[95396]: 2026-01-26T19:41:10Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b4:a5:5d 10.100.0.8
Jan 26 19:41:11 compute-0 sshd-session[206598]: Invalid user Admin from 193.32.162.151 port 49536
Jan 26 19:41:11 compute-0 sshd-session[206598]: Connection closed by invalid user Admin 193.32.162.151 port 49536 [preauth]
Jan 26 19:41:13 compute-0 nova_compute[183177]: 2026-01-26 19:41:13.563 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:13 compute-0 nova_compute[183177]: 2026-01-26 19:41:13.860 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:18 compute-0 nova_compute[183177]: 2026-01-26 19:41:18.567 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:18 compute-0 nova_compute[183177]: 2026-01-26 19:41:18.863 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:21 compute-0 podman[206601]: 2026-01-26 19:41:21.392432208 +0000 UTC m=+0.122594837 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 26 19:41:23 compute-0 nova_compute[183177]: 2026-01-26 19:41:23.620 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:23 compute-0 nova_compute[183177]: 2026-01-26 19:41:23.866 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:24.037 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:24.037 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:24.038 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:24 compute-0 podman[206630]: 2026-01-26 19:41:24.328064258 +0000 UTC m=+0.068625722 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 19:41:24 compute-0 podman[206629]: 2026-01-26 19:41:24.328052078 +0000 UTC m=+0.074060525 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 26 19:41:28 compute-0 nova_compute[183177]: 2026-01-26 19:41:28.655 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:28 compute-0 nova_compute[183177]: 2026-01-26 19:41:28.868 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:29 compute-0 podman[192499]: time="2026-01-26T19:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:41:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:41:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Jan 26 19:41:31 compute-0 nova_compute[183177]: 2026-01-26 19:41:31.175 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:31 compute-0 nova_compute[183177]: 2026-01-26 19:41:31.176 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:31 compute-0 openstack_network_exporter[195363]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:41:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:41:31 compute-0 openstack_network_exporter[195363]: ERROR   19:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:41:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:41:31 compute-0 nova_compute[183177]: 2026-01-26 19:41:31.682 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:41:32 compute-0 nova_compute[183177]: 2026-01-26 19:41:32.287 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:32 compute-0 nova_compute[183177]: 2026-01-26 19:41:32.289 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:32 compute-0 nova_compute[183177]: 2026-01-26 19:41:32.301 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:41:32 compute-0 nova_compute[183177]: 2026-01-26 19:41:32.301 183181 INFO nova.compute.claims [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:41:32 compute-0 podman[206668]: 2026-01-26 19:41:32.332456855 +0000 UTC m=+0.072374857 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.341 183181 DEBUG nova.scheduler.client.report [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.358 183181 DEBUG nova.scheduler.client.report [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.359 183181 DEBUG nova.compute.provider_tree [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.373 183181 DEBUG nova.scheduler.client.report [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.399 183181 DEBUG nova.scheduler.client.report [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.486 183181 DEBUG nova.compute.provider_tree [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.659 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.870 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:33 compute-0 nova_compute[183177]: 2026-01-26 19:41:33.993 183181 DEBUG nova.scheduler.client.report [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:41:34 compute-0 nova_compute[183177]: 2026-01-26 19:41:34.512 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.222s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:34 compute-0 nova_compute[183177]: 2026-01-26 19:41:34.512 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:41:35 compute-0 nova_compute[183177]: 2026-01-26 19:41:35.029 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:41:35 compute-0 nova_compute[183177]: 2026-01-26 19:41:35.029 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:41:35 compute-0 nova_compute[183177]: 2026-01-26 19:41:35.030 183181 WARNING neutronclient.v2_0.client [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:41:35 compute-0 nova_compute[183177]: 2026-01-26 19:41:35.031 183181 WARNING neutronclient.v2_0.client [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:41:35 compute-0 nova_compute[183177]: 2026-01-26 19:41:35.540 183181 INFO nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:41:36 compute-0 nova_compute[183177]: 2026-01-26 19:41:36.050 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:41:36 compute-0 nova_compute[183177]: 2026-01-26 19:41:36.484 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Successfully created port: 565fd712-31bf-40ce-ae6d-d74e1bfee02b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.072 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.075 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.075 183181 INFO nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Creating image(s)
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.076 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.077 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.078 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.079 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.086 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.089 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.182 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.184 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.184 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.185 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.189 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.190 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.272 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.273 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.340 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk 1073741824" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.342 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.342 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.405 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.406 183181 DEBUG nova.virt.disk.api [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Checking if we can resize image /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.407 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.463 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.463 183181 DEBUG nova.virt.disk.api [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Cannot resize image /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.464 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.464 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Ensure instance console log exists: /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.465 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.465 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.465 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.628 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Successfully updated port: 565fd712-31bf-40ce-ae6d-d74e1bfee02b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.727 183181 DEBUG nova.compute.manager [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-changed-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.728 183181 DEBUG nova.compute.manager [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Refreshing instance network info cache due to event network-changed-565fd712-31bf-40ce-ae6d-d74e1bfee02b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.728 183181 DEBUG oslo_concurrency.lockutils [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.728 183181 DEBUG oslo_concurrency.lockutils [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:41:37 compute-0 nova_compute[183177]: 2026-01-26 19:41:37.729 183181 DEBUG nova.network.neutron [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Refreshing network info cache for port 565fd712-31bf-40ce-ae6d-d74e1bfee02b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.135 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.235 183181 WARNING neutronclient.v2_0.client [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.366 183181 DEBUG nova.network.neutron [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.593 183181 DEBUG nova.network.neutron [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.664 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:38 compute-0 nova_compute[183177]: 2026-01-26 19:41:38.871 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:39 compute-0 nova_compute[183177]: 2026-01-26 19:41:39.101 183181 DEBUG oslo_concurrency.lockutils [req-ea31f93b-06e3-494d-a0a4-fafec1d54703 req-475df492-b90d-4ac6-8cd1-c7874a359bd2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:41:39 compute-0 nova_compute[183177]: 2026-01-26 19:41:39.102 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquired lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:41:39 compute-0 nova_compute[183177]: 2026-01-26 19:41:39.102 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.030 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.254 183181 WARNING neutronclient.v2_0.client [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.383 183181 DEBUG nova.network.neutron [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating instance_info_cache with network_info: [{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.892 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Releasing lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.893 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance network_info: |[{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.894 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Start _get_guest_xml network_info=[{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.898 183181 WARNING nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.899 183181 DEBUG nova.virt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-266020774', uuid='e94ea7ed-8444-4fef-a770-3fbf69f0f3dd'), owner=OwnerMeta(userid='ee1e0029a6ac4b56b09c165dc3cd4dda', username='tempest-TestExecuteActionsViaActuator-1232791976-project-admin', projectid='577ae27ca8cf44549308a35c420ae86d', projectname='tempest-TestExecuteActionsViaActuator-1232791976'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456500.8992944) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.903 183181 DEBUG nova.virt.libvirt.host [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.903 183181 DEBUG nova.virt.libvirt.host [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.906 183181 DEBUG nova.virt.libvirt.host [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.907 183181 DEBUG nova.virt.libvirt.host [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.909 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.909 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.910 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.910 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.910 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.910 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.911 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.911 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.911 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.912 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.912 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.912 183181 DEBUG nova.virt.hardware [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.917 183181 DEBUG nova.virt.libvirt.vif [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-266020774',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-266020774',id=8,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-96b5d3r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:41:36Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e94ea7ed-8444-4fef-a770-3fbf69f0f3dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.917 183181 DEBUG nova.network.os_vif_util [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.918 183181 DEBUG nova.network.os_vif_util [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:41:40 compute-0 nova_compute[183177]: 2026-01-26 19:41:40.919 183181 DEBUG nova.objects.instance [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lazy-loading 'pci_devices' on Instance uuid e94ea7ed-8444-4fef-a770-3fbf69f0f3dd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.426 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <uuid>e94ea7ed-8444-4fef-a770-3fbf69f0f3dd</uuid>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <name>instance-00000008</name>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-266020774</nova:name>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:41:40</nova:creationTime>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:41:41 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:41:41 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         <nova:port uuid="565fd712-31bf-40ce-ae6d-d74e1bfee02b">
Jan 26 19:41:41 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <system>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="serial">e94ea7ed-8444-4fef-a770-3fbf69f0f3dd</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="uuid">e94ea7ed-8444-4fef-a770-3fbf69f0f3dd</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </system>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <os>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </os>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <features>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </features>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:58:3a:33"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <target dev="tap565fd712-31"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/console.log" append="off"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <video>
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </video>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:41:41 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:41:41 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:41:41 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:41:41 compute-0 nova_compute[183177]: </domain>
Jan 26 19:41:41 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.427 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Preparing to wait for external event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.428 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.428 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.428 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.430 183181 DEBUG nova.virt.libvirt.vif [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-266020774',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-266020774',id=8,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-96b5d3r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:41:36Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e94ea7ed-8444-4fef-a770-3fbf69f0f3dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.430 183181 DEBUG nova.network.os_vif_util [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.431 183181 DEBUG nova.network.os_vif_util [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.432 183181 DEBUG os_vif [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.433 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.434 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.435 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.436 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '76b4d1bb-7643-522e-9500-1124c4805a13', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.437 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.442 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.442 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap565fd712-31, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.442 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap565fd712-31, col_values=(('qos', UUID('25242eb0-f2b9-4de8-842c-34cb7826d472')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.443 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap565fd712-31, col_values=(('external_ids', {'iface-id': '565fd712-31bf-40ce-ae6d-d74e1bfee02b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:3a:33', 'vm-uuid': 'e94ea7ed-8444-4fef-a770-3fbf69f0f3dd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.444 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 NetworkManager[55489]: <info>  [1769456501.4459] manager: (tap565fd712-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.446 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.455 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:41 compute-0 nova_compute[183177]: 2026-01-26 19:41:41.455 183181 INFO os_vif [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31')
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.008 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.009 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.009 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] No VIF found with MAC fa:16:3e:58:3a:33, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.009 183181 INFO nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Using config drive
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.521 183181 WARNING neutronclient.v2_0.client [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:41:43 compute-0 nova_compute[183177]: 2026-01-26 19:41:43.875 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.487 183181 INFO nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Creating config drive at /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.498 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnfh81cwy execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.638 183181 DEBUG oslo_concurrency.processutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnfh81cwy" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:44 compute-0 kernel: tap565fd712-31: entered promiscuous mode
Jan 26 19:41:44 compute-0 NetworkManager[55489]: <info>  [1769456504.7289] manager: (tap565fd712-31): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Jan 26 19:41:44 compute-0 systemd-udevd[206738]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.774 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:44 compute-0 ovn_controller[95396]: 2026-01-26T19:41:44Z|00075|binding|INFO|Claiming lport 565fd712-31bf-40ce-ae6d-d74e1bfee02b for this chassis.
Jan 26 19:41:44 compute-0 ovn_controller[95396]: 2026-01-26T19:41:44Z|00076|binding|INFO|565fd712-31bf-40ce-ae6d-d74e1bfee02b: Claiming fa:16:3e:58:3a:33 10.100.0.5
Jan 26 19:41:44 compute-0 ovn_controller[95396]: 2026-01-26T19:41:44Z|00077|binding|INFO|Setting lport 565fd712-31bf-40ce-ae6d-d74e1bfee02b ovn-installed in OVS
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.789 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:44 compute-0 nova_compute[183177]: 2026-01-26 19:41:44.791 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:44 compute-0 NetworkManager[55489]: <info>  [1769456504.7939] device (tap565fd712-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:41:44 compute-0 NetworkManager[55489]: <info>  [1769456504.7952] device (tap565fd712-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:41:44 compute-0 systemd-machined[154465]: New machine qemu-6-instance-00000008.
Jan 26 19:41:44 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Jan 26 19:41:44 compute-0 ovn_controller[95396]: 2026-01-26T19:41:44Z|00078|binding|INFO|Setting lport 565fd712-31bf-40ce-ae6d-d74e1bfee02b up in Southbound
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.877 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:3a:33 10.100.0.5'], port_security=['fa:16:3e:58:3a:33 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e94ea7ed-8444-4fef-a770-3fbf69f0f3dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=565fd712-31bf-40ce-ae6d-d74e1bfee02b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.879 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 565fd712-31bf-40ce-ae6d-d74e1bfee02b in datapath 02893814-74cb-419e-9539-9a1c8c79b4be bound to our chassis
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.881 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.903 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[069755e7-07fa-4b24-8387-3af170f2e5fb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.949 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[69efea60-b311-400d-af5d-c957256b9126]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.952 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[cf86838b-1235-4d8b-8225-e7cdf3f29c8e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:44.996 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[9107b213-2514-4b32-a4c5-ac6900fc58fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.027 183181 DEBUG nova.compute.manager [req-f0548477-4530-455a-b84f-15548ed7ba4a req-25bbfac0-dc0c-473c-9bbe-11227550753b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.028 183181 DEBUG oslo_concurrency.lockutils [req-f0548477-4530-455a-b84f-15548ed7ba4a req-25bbfac0-dc0c-473c-9bbe-11227550753b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.028 183181 DEBUG oslo_concurrency.lockutils [req-f0548477-4530-455a-b84f-15548ed7ba4a req-25bbfac0-dc0c-473c-9bbe-11227550753b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.028 183181 DEBUG oslo_concurrency.lockutils [req-f0548477-4530-455a-b84f-15548ed7ba4a req-25bbfac0-dc0c-473c-9bbe-11227550753b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.029 183181 DEBUG nova.compute.manager [req-f0548477-4530-455a-b84f-15548ed7ba4a req-25bbfac0-dc0c-473c-9bbe-11227550753b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Processing event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.039 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8386475b-b12b-420a-b91b-8e2ec2243dd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206755, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.060 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9f1d854b-c615-436f-b1e7-351797de1325]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388481, 'tstamp': 388481}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206756, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388484, 'tstamp': 388484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206756, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.062 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.064 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.066 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.066 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.067 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.067 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.067 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:41:45 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:45.069 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d0be5584-77d4-4e9e-919c-d51b74610ee5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.251 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.254 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.258 183181 INFO nova.virt.libvirt.driver [-] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance spawned successfully.
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.258 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.775 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.776 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.776 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.777 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.778 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:45 compute-0 nova_compute[183177]: 2026-01-26 19:41:45.779 183181 DEBUG nova.virt.libvirt.driver [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:41:46 compute-0 nova_compute[183177]: 2026-01-26 19:41:46.444 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:46 compute-0 nova_compute[183177]: 2026-01-26 19:41:46.523 183181 INFO nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Took 9.45 seconds to spawn the instance on the hypervisor.
Jan 26 19:41:46 compute-0 nova_compute[183177]: 2026-01-26 19:41:46.524 183181 DEBUG nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.066 183181 INFO nova.compute.manager [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Took 14.87 seconds to build instance.
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.108 183181 DEBUG nova.compute.manager [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.108 183181 DEBUG oslo_concurrency.lockutils [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.108 183181 DEBUG oslo_concurrency.lockutils [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.109 183181 DEBUG oslo_concurrency.lockutils [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.109 183181 DEBUG nova.compute.manager [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] No waiting events found dispatching network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.109 183181 WARNING nova.compute.manager [req-22b2c61c-9074-47e6-8f28-9c38182de6b9 req-4bbfc124-d837-45ad-9d4d-dfd83c18497f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received unexpected event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b for instance with vm_state active and task_state None.
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.571 183181 DEBUG oslo_concurrency.lockutils [None req-3f342b23-b0c6-43ec-b276-83a8461a0609 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.395s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:47 compute-0 nova_compute[183177]: 2026-01-26 19:41:47.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.733 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.836 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.837 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.891 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.923 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.928 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.996 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:48 compute-0 nova_compute[183177]: 2026-01-26 19:41:48.997 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.060 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.067 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.125 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.126 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.185 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.360 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.363 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.386 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.387 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5345MB free_disk=73.04651641845703GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.387 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:41:49 compute-0 nova_compute[183177]: 2026-01-26 19:41:49.388 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.467 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.468 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance e1a70852-daf7-45b0-849d-957892f3d109 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.468 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance e94ea7ed-8444-4fef-a770-3fbf69f0f3dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.469 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.469 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:41:49 up  1:06,  0 user,  load average: 0.49, 0.37, 0.47\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:41:50 compute-0 nova_compute[183177]: 2026-01-26 19:41:50.564 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:41:51 compute-0 nova_compute[183177]: 2026-01-26 19:41:51.074 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:41:51 compute-0 nova_compute[183177]: 2026-01-26 19:41:51.446 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:51 compute-0 nova_compute[183177]: 2026-01-26 19:41:51.591 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:41:51 compute-0 nova_compute[183177]: 2026-01-26 19:41:51.592 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.204s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:41:52 compute-0 podman[206784]: 2026-01-26 19:41:52.395638181 +0000 UTC m=+0.135761101 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 19:41:53 compute-0 nova_compute[183177]: 2026-01-26 19:41:53.592 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:53 compute-0 nova_compute[183177]: 2026-01-26 19:41:53.593 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:53 compute-0 nova_compute[183177]: 2026-01-26 19:41:53.593 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:53 compute-0 nova_compute[183177]: 2026-01-26 19:41:53.893 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:55 compute-0 podman[206811]: 2026-01-26 19:41:55.324944034 +0000 UTC m=+0.071001030 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=)
Jan 26 19:41:55 compute-0 podman[206812]: 2026-01-26 19:41:55.344297345 +0000 UTC m=+0.079430397 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 19:41:56 compute-0 nova_compute[183177]: 2026-01-26 19:41:56.449 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:57.050 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:41:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:57.051 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:41:57 compute-0 nova_compute[183177]: 2026-01-26 19:41:57.051 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:41:57.053 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:41:57 compute-0 nova_compute[183177]: 2026-01-26 19:41:57.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:41:57 compute-0 ovn_controller[95396]: 2026-01-26T19:41:57Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:3a:33 10.100.0.5
Jan 26 19:41:57 compute-0 ovn_controller[95396]: 2026-01-26T19:41:57Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:3a:33 10.100.0.5
Jan 26 19:41:58 compute-0 nova_compute[183177]: 2026-01-26 19:41:58.896 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:41:59 compute-0 podman[192499]: time="2026-01-26T19:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:41:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:41:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Jan 26 19:42:01 compute-0 openstack_network_exporter[195363]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:42:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:42:01 compute-0 openstack_network_exporter[195363]: ERROR   19:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:42:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:42:01 compute-0 nova_compute[183177]: 2026-01-26 19:42:01.450 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:01 compute-0 anacron[30974]: Job `cron.daily' started
Jan 26 19:42:01 compute-0 anacron[30974]: Job `cron.daily' terminated
Jan 26 19:42:03 compute-0 podman[206876]: 2026-01-26 19:42:03.323833749 +0000 UTC m=+0.070761133 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:42:03 compute-0 nova_compute[183177]: 2026-01-26 19:42:03.898 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:06 compute-0 nova_compute[183177]: 2026-01-26 19:42:06.452 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:08 compute-0 nova_compute[183177]: 2026-01-26 19:42:08.901 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:11 compute-0 nova_compute[183177]: 2026-01-26 19:42:11.453 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:13 compute-0 nova_compute[183177]: 2026-01-26 19:42:13.903 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:16 compute-0 nova_compute[183177]: 2026-01-26 19:42:16.455 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:18 compute-0 nova_compute[183177]: 2026-01-26 19:42:18.905 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:21 compute-0 nova_compute[183177]: 2026-01-26 19:42:21.456 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:23 compute-0 podman[206903]: 2026-01-26 19:42:23.346087797 +0000 UTC m=+0.092502037 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:42:23 compute-0 nova_compute[183177]: 2026-01-26 19:42:23.936 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:24.039 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:24.039 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:24.040 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:25 compute-0 nova_compute[183177]: 2026-01-26 19:42:25.767 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Check if temp file /var/lib/nova/instances/tmp9tm0vvr7 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 19:42:25 compute-0 nova_compute[183177]: 2026-01-26 19:42:25.774 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9tm0vvr7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e1a70852-daf7-45b0-849d-957892f3d109',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 19:42:26 compute-0 podman[206930]: 2026-01-26 19:42:26.3516319 +0000 UTC m=+0.102677941 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=)
Jan 26 19:42:26 compute-0 podman[206931]: 2026-01-26 19:42:26.354303722 +0000 UTC m=+0.091124921 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:42:26 compute-0 nova_compute[183177]: 2026-01-26 19:42:26.458 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:28 compute-0 nova_compute[183177]: 2026-01-26 19:42:28.376 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:42:28 compute-0 nova_compute[183177]: 2026-01-26 19:42:28.376 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:42:28 compute-0 nova_compute[183177]: 2026-01-26 19:42:28.377 183181 DEBUG nova.network.neutron [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:42:28 compute-0 nova_compute[183177]: 2026-01-26 19:42:28.889 183181 WARNING neutronclient.v2_0.client [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:28 compute-0 nova_compute[183177]: 2026-01-26 19:42:28.939 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:29 compute-0 podman[192499]: time="2026-01-26T19:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:42:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:42:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Jan 26 19:42:30 compute-0 nova_compute[183177]: 2026-01-26 19:42:30.003 183181 WARNING neutronclient.v2_0.client [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:30 compute-0 nova_compute[183177]: 2026-01-26 19:42:30.543 183181 DEBUG nova.network.neutron [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating instance_info_cache with network_info: [{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:42:30 compute-0 nova_compute[183177]: 2026-01-26 19:42:30.924 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.012 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.014 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.051 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.099 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.101 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Preparing to wait for external event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.101 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.102 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.102 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:31 compute-0 openstack_network_exporter[195363]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:42:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:42:31 compute-0 openstack_network_exporter[195363]: ERROR   19:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:42:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:42:31 compute-0 nova_compute[183177]: 2026-01-26 19:42:31.460 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:32 compute-0 nova_compute[183177]: 2026-01-26 19:42:32.596 183181 DEBUG nova.virt.libvirt.driver [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Jan 26 19:42:32 compute-0 nova_compute[183177]: 2026-01-26 19:42:32.597 183181 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Creating file /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/69767ce18211466fa438d0d5a04a9149.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 26 19:42:32 compute-0 nova_compute[183177]: 2026-01-26 19:42:32.597 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/69767ce18211466fa438d0d5a04a9149.tmp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.015 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/69767ce18211466fa438d0d5a04a9149.tmp" returned: 1 in 0.418s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.017 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/69767ce18211466fa438d0d5a04a9149.tmp' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.017 183181 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Creating directory /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd on remote host 192.168.122.101 create_dir /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.018 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.222 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" returned: 0 in 0.204s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.228 183181 DEBUG nova.virt.libvirt.driver [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Jan 26 19:42:33 compute-0 nova_compute[183177]: 2026-01-26 19:42:33.973 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:34 compute-0 podman[206980]: 2026-01-26 19:42:34.315394861 +0000 UTC m=+0.049056750 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:42:35 compute-0 kernel: tap565fd712-31 (unregistering): left promiscuous mode
Jan 26 19:42:35 compute-0 NetworkManager[55489]: <info>  [1769456555.4676] device (tap565fd712-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:42:35 compute-0 ovn_controller[95396]: 2026-01-26T19:42:35Z|00079|binding|INFO|Releasing lport 565fd712-31bf-40ce-ae6d-d74e1bfee02b from this chassis (sb_readonly=0)
Jan 26 19:42:35 compute-0 ovn_controller[95396]: 2026-01-26T19:42:35Z|00080|binding|INFO|Setting lport 565fd712-31bf-40ce-ae6d-d74e1bfee02b down in Southbound
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.473 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 ovn_controller[95396]: 2026-01-26T19:42:35Z|00081|binding|INFO|Removing iface tap565fd712-31 ovn-installed in OVS
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.476 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.499 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:3a:33 10.100.0.5'], port_security=['fa:16:3e:58:3a:33 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e94ea7ed-8444-4fef-a770-3fbf69f0f3dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=565fd712-31bf-40ce-ae6d-d74e1bfee02b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.501 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 565fd712-31bf-40ce-ae6d-d74e1bfee02b in datapath 02893814-74cb-419e-9539-9a1c8c79b4be unbound from our chassis
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.502 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.506 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.530 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[26beb66c-c9aa-41d9-8280-68b48412fe72]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.578 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8ad780-56ff-4a28-ac6c-9b10c1728ee3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 15.626s CPU time.
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.582 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f06fb8-92de-41dd-9d03-2f49ffa98f4e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 systemd-machined[154465]: Machine qemu-6-instance-00000008 terminated.
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.624 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d4eec56b-21a7-444c-99ac-40252096520c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.653 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6e201f25-632c-402a-92cf-8c45a45d9528]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207016, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.679 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[168f2f79-fa58-4681-b840-b43b1da20584]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388481, 'tstamp': 388481}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207017, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388484, 'tstamp': 388484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207017, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.681 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.683 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.690 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.691 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.691 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.692 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.692 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:42:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:35.694 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d0822d17-e5c6-4736-a99d-620640741c90]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.737 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:35 compute-0 nova_compute[183177]: 2026-01-26 19:42:35.744 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.257 183181 INFO nova.virt.libvirt.driver [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance shutdown successfully after 3 seconds.
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.263 183181 INFO nova.virt.libvirt.driver [-] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Instance destroyed successfully.
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.265 183181 DEBUG nova.virt.libvirt.vif [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-266020774',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-266020774',id=8,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:41:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-96b5d3r0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:42:22Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e94ea7ed-8444-4fef-a770-3fbf69f0f3dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:58:3a:33"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.266 183181 DEBUG nova.network.os_vif_util [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "vif_mac": "fa:16:3e:58:3a:33"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.267 183181 DEBUG nova.network.os_vif_util [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.268 183181 DEBUG os_vif [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.272 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.273 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap565fd712-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.311 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.316 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.317 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.318 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=25242eb0-f2b9-4de8-842c-34cb7826d472) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.319 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.321 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.325 183181 INFO os_vif [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31')
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.330 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.419 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.421 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.508 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.511 183181 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Copying file /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk to 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.512 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.574 183181 DEBUG nova.compute.manager [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.575 183181 DEBUG oslo_concurrency.lockutils [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.576 183181 DEBUG oslo_concurrency.lockutils [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.576 183181 DEBUG oslo_concurrency.lockutils [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.576 183181 DEBUG nova.compute.manager [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] No waiting events found dispatching network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.577 183181 WARNING nova.compute.manager [req-9b6b6f31-5cec-44b6-be6a-d76baa161bf0 req-5d2ab057-cbc3-47cb-a478-855bb51e35ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received unexpected event network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b for instance with vm_state active and task_state resize_migrating.
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.951 183181 DEBUG nova.compute.manager [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.951 183181 DEBUG oslo_concurrency.lockutils [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.952 183181 DEBUG oslo_concurrency.lockutils [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.953 183181 DEBUG oslo_concurrency.lockutils [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.954 183181 DEBUG nova.compute.manager [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No event matching network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 in dict_keys([('network-vif-plugged', '324164fa-164b-418a-ba25-b2508e836f80')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 19:42:36 compute-0 nova_compute[183177]: 2026-01-26 19:42:36.955 183181 DEBUG nova.compute.manager [req-64765c6c-9464-4710-abf7-fe32c6a357b7 req-cf5d267e-2cb0-4185-a93f-59690895de92 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.172 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "scp -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk" returned: 0 in 0.660s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.172 183181 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Copying file /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.173 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.config 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.516 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "scp -C -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.config 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.config" returned: 0 in 0.343s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.517 183181 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Copying file /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.517 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.info 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.661 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.732 183181 DEBUG oslo_concurrency.processutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "scp -C -r /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd_resize/disk.info 192.168.122.101:/var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk.info" returned: 0 in 0.215s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.735 183181 WARNING neutronclient.v2_0.client [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:37 compute-0 nova_compute[183177]: 2026-01-26 19:42:37.735 183181 WARNING neutronclient.v2_0.client [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.127 183181 INFO nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Took 7.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.438 183181 DEBUG neutronclient.v2_0.client [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 565fd712-31bf-40ce-ae6d-d74e1bfee02b for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.635 183181 DEBUG nova.compute.manager [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.635 183181 DEBUG oslo_concurrency.lockutils [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.636 183181 DEBUG oslo_concurrency.lockutils [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.637 183181 DEBUG oslo_concurrency.lockutils [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.637 183181 DEBUG nova.compute.manager [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] No waiting events found dispatching network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.638 183181 WARNING nova.compute.manager [req-d531a2f9-f2eb-415c-84a3-357fe0cb7c42 req-4fae3815-8c24-44de-8f94-962c0b8f7150 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received unexpected event network-vif-unplugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b for instance with vm_state active and task_state resize_migrating.
Jan 26 19:42:38 compute-0 nova_compute[183177]: 2026-01-26 19:42:38.976 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.037 183181 DEBUG nova.compute.manager [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.037 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.038 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.038 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.038 183181 DEBUG nova.compute.manager [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Processing event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.039 183181 DEBUG nova.compute.manager [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-changed-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.039 183181 DEBUG nova.compute.manager [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Refreshing instance network info cache due to event network-changed-324164fa-164b-418a-ba25-b2508e836f80. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.039 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.040 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.040 183181 DEBUG nova.network.neutron [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Refreshing network info cache for port 324164fa-164b-418a-ba25-b2508e836f80 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.042 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.498 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.498 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.499 183181 DEBUG oslo_concurrency.lockutils [None req-bdb0297d-659f-445e-885b-19b4991d7b78 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.549 183181 WARNING neutronclient.v2_0.client [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:39 compute-0 nova_compute[183177]: 2026-01-26 19:42:39.554 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9tm0vvr7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e1a70852-daf7-45b0-849d-957892f3d109',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(456beb37-331c-4beb-8a2c-8b5a3d0f9102),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.069 183181 DEBUG nova.objects.instance [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid e1a70852-daf7-45b0-849d-957892f3d109 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.072 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.075 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.075 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.434 183181 WARNING neutronclient.v2_0.client [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.578 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.578 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.586 183181 DEBUG nova.virt.libvirt.vif [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:40:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-51219326',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-51219326',id=6,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:40:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-l1d7bs6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:40:58Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e1a70852-daf7-45b0-849d-957892f3d109,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.587 183181 DEBUG nova.network.os_vif_util [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.588 183181 DEBUG nova.network.os_vif_util [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.589 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:b4:a5:5d"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <target dev="tap324164fa-16"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]: </interface>
Jan 26 19:42:40 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.590 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <name>instance-00000006</name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <uuid>e1a70852-daf7-45b0-849d-957892f3d109</uuid>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-51219326</nova:name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:40:52</nova:creationTime>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:port uuid="324164fa-164b-418a-ba25-b2508e836f80">
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="serial">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="uuid">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:b4:a5:5d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap324164fa-16"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </target>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </console>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </input>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]: </domain>
Jan 26 19:42:40 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.590 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <name>instance-00000006</name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <uuid>e1a70852-daf7-45b0-849d-957892f3d109</uuid>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-51219326</nova:name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:40:52</nova:creationTime>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:port uuid="324164fa-164b-418a-ba25-b2508e836f80">
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="serial">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="uuid">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:b4:a5:5d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap324164fa-16"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </target>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </console>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </input>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]: </domain>
Jan 26 19:42:40 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.591 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <name>instance-00000006</name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <uuid>e1a70852-daf7-45b0-849d-957892f3d109</uuid>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-51219326</nova:name>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:40:52</nova:creationTime>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:user uuid="ee1e0029a6ac4b56b09c165dc3cd4dda">tempest-TestExecuteActionsViaActuator-1232791976-project-admin</nova:user>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:project uuid="577ae27ca8cf44549308a35c420ae86d">tempest-TestExecuteActionsViaActuator-1232791976</nova:project>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <nova:port uuid="324164fa-164b-418a-ba25-b2508e836f80">
Jan 26 19:42:40 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="serial">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="uuid">e1a70852-daf7-45b0-849d-957892f3d109</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </system>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </os>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </features>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/disk.config"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:b4:a5:5d"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target dev="tap324164fa-16"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:42:40 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       </target>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109/console.log" append="off"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </console>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </input>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </video>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:42:40 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:42:40 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:42:40 compute-0 nova_compute[183177]: </domain>
Jan 26 19:42:40 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 19:42:40 compute-0 nova_compute[183177]: 2026-01-26 19:42:40.591 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 19:42:41 compute-0 nova_compute[183177]: 2026-01-26 19:42:41.082 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:42:41 compute-0 nova_compute[183177]: 2026-01-26 19:42:41.082 183181 INFO nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 19:42:41 compute-0 nova_compute[183177]: 2026-01-26 19:42:41.334 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:41 compute-0 nova_compute[183177]: 2026-01-26 19:42:41.532 183181 DEBUG nova.network.neutron [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updated VIF entry in instance network info cache for port 324164fa-164b-418a-ba25-b2508e836f80. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:42:41 compute-0 nova_compute[183177]: 2026-01-26 19:42:41.533 183181 DEBUG nova.network.neutron [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updating instance_info_cache with network_info: [{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:42:42 compute-0 nova_compute[183177]: 2026-01-26 19:42:42.040 183181 DEBUG oslo_concurrency.lockutils [req-9524b6c9-33cc-4f5b-a4f1-67b209c808c6 req-aab1f4ad-6001-46ae-87f2-671d1e44bc42 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e1a70852-daf7-45b0-849d-957892f3d109" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:42:42 compute-0 nova_compute[183177]: 2026-01-26 19:42:42.102 183181 INFO nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 19:42:42 compute-0 nova_compute[183177]: 2026-01-26 19:42:42.606 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:42:42 compute-0 nova_compute[183177]: 2026-01-26 19:42:42.606 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.111 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.112 183181 DEBUG nova.virt.libvirt.migration [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 19:42:43 compute-0 kernel: tap324164fa-16 (unregistering): left promiscuous mode
Jan 26 19:42:43 compute-0 NetworkManager[55489]: <info>  [1769456563.5829] device (tap324164fa-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:42:43 compute-0 ovn_controller[95396]: 2026-01-26T19:42:43Z|00082|binding|INFO|Releasing lport 324164fa-164b-418a-ba25-b2508e836f80 from this chassis (sb_readonly=0)
Jan 26 19:42:43 compute-0 ovn_controller[95396]: 2026-01-26T19:42:43Z|00083|binding|INFO|Setting lport 324164fa-164b-418a-ba25-b2508e836f80 down in Southbound
Jan 26 19:42:43 compute-0 ovn_controller[95396]: 2026-01-26T19:42:43Z|00084|binding|INFO|Removing iface tap324164fa-16 ovn-installed in OVS
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.634 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.638 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.642 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:a5:5d 10.100.0.8'], port_security=['fa:16:3e:b4:a5:5d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e1a70852-daf7-45b0-849d-957892f3d109', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=324164fa-164b-418a-ba25-b2508e836f80) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.644 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 324164fa-164b-418a-ba25-b2508e836f80 in datapath 02893814-74cb-419e-9539-9a1c8c79b4be unbound from our chassis
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.647 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02893814-74cb-419e-9539-9a1c8c79b4be
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.650 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.680 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[344ca5f7-77ce-4110-a087-aa49c06a9071]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 26 19:42:43 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 18.610s CPU time.
Jan 26 19:42:43 compute-0 systemd-machined[154465]: Machine qemu-5-instance-00000006 terminated.
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.734 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdfbfd3-ee67-4324-b567-620dfc667773]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.738 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[33936daa-63f8-4ff6-ade0-e7c46c8ff172]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.788 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8d502f-e5ab-4234-8e0f-363c598c849e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.794 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.805 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.821 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[09342e3b-efa7-46b1-8b7c-69b3d7bf4d3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02893814-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7c:b9:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388467, 'reachable_time': 43675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207083, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.844 183181 DEBUG nova.virt.libvirt.guest [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.846 183181 INFO nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration operation has completed
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.846 183181 INFO nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] _post_live_migration() is started..
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.850 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.851 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.851 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.849 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[028abdc9-5213-4a6e-b23f-ff896a32e1f6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388481, 'tstamp': 388481}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207094, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap02893814-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388484, 'tstamp': 388484}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207094, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.852 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.858 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.860 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.861 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02893814-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.862 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.862 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02893814-70, col_values=(('external_ids', {'iface-id': '8041d9b1-f920-4d7e-a0aa-621f944e098e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.863 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.865 183181 WARNING neutronclient.v2_0.client [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.865 183181 WARNING neutronclient.v2_0.client [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:42:43.865 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9bec28db-b138-4f87-917d-bbb96f09fb4d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-02893814-74cb-419e-9539-9a1c8c79b4be\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 02893814-74cb-419e-9539-9a1c8c79b4be\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.899 183181 DEBUG nova.compute.manager [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-changed-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.900 183181 DEBUG nova.compute.manager [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Refreshing instance network info cache due to event network-changed-565fd712-31bf-40ce-ae6d-d74e1bfee02b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.900 183181 DEBUG oslo_concurrency.lockutils [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.900 183181 DEBUG oslo_concurrency.lockutils [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.901 183181 DEBUG nova.network.neutron [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Refreshing network info cache for port 565fd712-31bf-40ce-ae6d-d74e1bfee02b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:42:43 compute-0 nova_compute[183177]: 2026-01-26 19:42:43.979 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.257 183181 DEBUG nova.compute.manager [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.258 183181 DEBUG oslo_concurrency.lockutils [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.258 183181 DEBUG oslo_concurrency.lockutils [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.258 183181 DEBUG oslo_concurrency.lockutils [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.259 183181 DEBUG nova.compute.manager [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.259 183181 DEBUG nova.compute.manager [req-68485e58-94ac-4d6a-9e7e-2f189bb14417 req-f8386965-4507-4f94-9c6b-278af405680b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.409 183181 WARNING neutronclient.v2_0.client [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.577 183181 DEBUG nova.network.neutron [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 324164fa-164b-418a-ba25-b2508e836f80 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.578 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.579 183181 DEBUG nova.virt.libvirt.vif [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:40:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-51219326',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-51219326',id=6,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:40:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-l1d7bs6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:42:20Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e1a70852-daf7-45b0-849d-957892f3d109,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.580 183181 DEBUG nova.network.os_vif_util [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "324164fa-164b-418a-ba25-b2508e836f80", "address": "fa:16:3e:b4:a5:5d", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap324164fa-16", "ovs_interfaceid": "324164fa-164b-418a-ba25-b2508e836f80", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.581 183181 DEBUG nova.network.os_vif_util [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.582 183181 DEBUG os_vif [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.586 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.587 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap324164fa-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.590 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.593 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.594 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.595 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4668179b-9f24-4134-86de-fb08dea07ce0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.596 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.597 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.601 183181 INFO os_vif [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:a5:5d,bridge_name='br-int',has_traffic_filtering=True,id=324164fa-164b-418a-ba25-b2508e836f80,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap324164fa-16')
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.602 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.602 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.603 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.603 183181 DEBUG nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.604 183181 INFO nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Deleting instance files /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109_del
Jan 26 19:42:44 compute-0 nova_compute[183177]: 2026-01-26 19:42:44.605 183181 INFO nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Deletion of /var/lib/nova/instances/e1a70852-daf7-45b0-849d-957892f3d109_del complete
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.053 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.054 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.054 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.054 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.055 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.055 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.055 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.056 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.056 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.056 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.057 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.057 183181 WARNING nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received unexpected event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with vm_state active and task_state migrating.
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.057 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.058 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.058 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.058 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.059 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.059 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-unplugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.059 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.059 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.060 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.060 183181 DEBUG oslo_concurrency.lockutils [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.061 183181 DEBUG nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.061 183181 WARNING nova.compute.manager [req-b6f93fb3-da3a-4d2c-b29b-86f1136a9e0e req-727c3d87-1560-46f0-972c-0c2b18c1a53a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received unexpected event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with vm_state active and task_state migrating.
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.308 183181 DEBUG nova.compute.manager [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.308 183181 DEBUG oslo_concurrency.lockutils [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.309 183181 DEBUG oslo_concurrency.lockutils [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.309 183181 DEBUG oslo_concurrency.lockutils [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.309 183181 DEBUG nova.compute.manager [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] No waiting events found dispatching network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.309 183181 WARNING nova.compute.manager [req-fbd0700a-7d7a-433c-bf51-382d42b21638 req-d45496cf-aeb3-4c15-9a00-f319e9af7739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Received unexpected event network-vif-plugged-324164fa-164b-418a-ba25-b2508e836f80 for instance with vm_state active and task_state migrating.
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.544 183181 WARNING neutronclient.v2_0.client [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.767 183181 DEBUG nova.network.neutron [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updated VIF entry in instance network info cache for port 565fd712-31bf-40ce-ae6d-d74e1bfee02b. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:42:46 compute-0 nova_compute[183177]: 2026-01-26 19:42:46.768 183181 DEBUG nova.network.neutron [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating instance_info_cache with network_info: [{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:42:47 compute-0 nova_compute[183177]: 2026-01-26 19:42:47.275 183181 DEBUG oslo_concurrency.lockutils [req-647747f7-2d6b-4d59-a0c1-245c2023ab31 req-0ee551ec-f071-4b97-8c9f-548a2acedd83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:42:47 compute-0 nova_compute[183177]: 2026-01-26 19:42:47.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.667 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:42:48 compute-0 nova_compute[183177]: 2026-01-26 19:42:48.982 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.597 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.729 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.820 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.821 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.901 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:49 compute-0 nova_compute[183177]: 2026-01-26 19:42:49.906 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000008, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.133 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.135 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.175 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.176 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5591MB free_disk=73.04730224609375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.176 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:50 compute-0 nova_compute[183177]: 2026-01-26 19:42:50.177 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.201 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration for instance e94ea7ed-8444-4fef-a770-3fbf69f0f3dd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.579 183181 DEBUG nova.compute.manager [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.580 183181 DEBUG oslo_concurrency.lockutils [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.581 183181 DEBUG oslo_concurrency.lockutils [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.582 183181 DEBUG oslo_concurrency.lockutils [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.582 183181 DEBUG nova.compute.manager [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] No waiting events found dispatching network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.583 183181 WARNING nova.compute.manager [req-d6cd70c6-5013-4434-b615-9d191c3b5359 req-925db0a6-ea28-4ab8-bda6-666872486768 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received unexpected event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b for instance with vm_state active and task_state resize_finish.
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.737 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Updating resource usage from migration 456beb37-331c-4beb-8a2c-8b5a3d0f9102
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.738 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating resource usage from migration f7582726-b1cc-416b-b57c-278f2efd8baa
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.738 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Starting to track outgoing migration f7582726-b1cc-416b-b57c-278f2efd8baa with flavor 78406e00-3362-4102-beb8-369c301866f3 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1549
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.765 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.765 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration 456beb37-331c-4beb-8a2c-8b5a3d0f9102 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.765 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration f7582726-b1cc-416b-b57c-278f2efd8baa is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.765 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.766 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:42:50 up  1:07,  0 user,  load average: 0.34, 0.36, 0.46\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '2', 'io_workload': '0', 'num_task_migrating': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:42:51 compute-0 nova_compute[183177]: 2026-01-26 19:42:51.855 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:42:52 compute-0 nova_compute[183177]: 2026-01-26 19:42:52.362 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:42:52 compute-0 nova_compute[183177]: 2026-01-26 19:42:52.877 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:42:52 compute-0 nova_compute[183177]: 2026-01-26 19:42:52.877 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.700s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.667 183181 DEBUG nova.compute.manager [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.668 183181 DEBUG oslo_concurrency.lockutils [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.668 183181 DEBUG oslo_concurrency.lockutils [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.669 183181 DEBUG oslo_concurrency.lockutils [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.669 183181 DEBUG nova.compute.manager [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] No waiting events found dispatching network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.669 183181 WARNING nova.compute.manager [req-60cd5158-abc9-4863-8841-9941916306ba req-2f31f825-caa6-486d-9b24-623824a08739 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Received unexpected event network-vif-plugged-565fd712-31bf-40ce-ae6d-d74e1bfee02b for instance with vm_state resized and task_state None.
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.873 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.874 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.874 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.875 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.875 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.875 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:42:53 compute-0 nova_compute[183177]: 2026-01-26 19:42:53.984 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.150 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e1a70852-daf7-45b0-849d-957892f3d109-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.150 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.150 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e1a70852-daf7-45b0-849d-957892f3d109-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:54 compute-0 podman[207105]: 2026-01-26 19:42:54.420732325 +0000 UTC m=+0.155978674 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260120, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.598 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.661 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.662 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.662 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:54 compute-0 nova_compute[183177]: 2026-01-26 19:42:54.663 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.705 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.800 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.802 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.889 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:55 compute-0 nova_compute[183177]: 2026-01-26 19:42:55.898 183181 WARNING nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-00000008, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/e94ea7ed-8444-4fef-a770-3fbf69f0f3dd/disk
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.139 183181 WARNING nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.141 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.182 183181 DEBUG oslo_concurrency.processutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.184 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=73.04730224609375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.185 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.185 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.661 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:42:56 compute-0 nova_compute[183177]: 2026-01-26 19:42:56.662 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:42:57 compute-0 nova_compute[183177]: 2026-01-26 19:42:57.207 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance e1a70852-daf7-45b0-849d-957892f3d109 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:42:57 compute-0 nova_compute[183177]: 2026-01-26 19:42:57.208 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance e94ea7ed-8444-4fef-a770-3fbf69f0f3dd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:42:57 compute-0 podman[207140]: 2026-01-26 19:42:57.367582652 +0000 UTC m=+0.102842607 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Jan 26 19:42:57 compute-0 podman[207139]: 2026-01-26 19:42:57.390740694 +0000 UTC m=+0.131286960 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:42:57 compute-0 nova_compute[183177]: 2026-01-26 19:42:57.718 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.228 183181 INFO nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating resource usage from migration f7582726-b1cc-416b-b57c-278f2efd8baa
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.229 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Starting to track outgoing migration f7582726-b1cc-416b-b57c-278f2efd8baa with flavor 78406e00-3362-4102-beb8-369c301866f3 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1549
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.264 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.264 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 456beb37-331c-4beb-8a2c-8b5a3d0f9102 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.265 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration f7582726-b1cc-416b-b57c-278f2efd8baa is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.265 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.266 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:42:56 up  1:07,  0 user,  load average: 0.32, 0.35, 0.45\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.353 183181 DEBUG nova.compute.provider_tree [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.863 183181 DEBUG nova.scheduler.client.report [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:42:58 compute-0 nova_compute[183177]: 2026-01-26 19:42:58.986 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:59 compute-0 nova_compute[183177]: 2026-01-26 19:42:59.377 183181 DEBUG nova.compute.resource_tracker [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:42:59 compute-0 nova_compute[183177]: 2026-01-26 19:42:59.377 183181 DEBUG oslo_concurrency.lockutils [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.192s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:42:59 compute-0 nova_compute[183177]: 2026-01-26 19:42:59.408 183181 INFO nova.compute.manager [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 19:42:59 compute-0 nova_compute[183177]: 2026-01-26 19:42:59.600 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:42:59 compute-0 podman[192499]: time="2026-01-26T19:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:42:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:42:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Jan 26 19:43:00 compute-0 nova_compute[183177]: 2026-01-26 19:43:00.505 183181 INFO nova.scheduler.client.report [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 456beb37-331c-4beb-8a2c-8b5a3d0f9102
Jan 26 19:43:00 compute-0 nova_compute[183177]: 2026-01-26 19:43:00.506 183181 DEBUG nova.virt.libvirt.driver [None req-004845e8-fcfb-4dd3-a58c-bbff23173b10 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e1a70852-daf7-45b0-849d-957892f3d109] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 19:43:00 compute-0 nova_compute[183177]: 2026-01-26 19:43:00.837 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:00 compute-0 nova_compute[183177]: 2026-01-26 19:43:00.839 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:00 compute-0 nova_compute[183177]: 2026-01-26 19:43:00.839 183181 DEBUG nova.compute.manager [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5287
Jan 26 19:43:01 compute-0 nova_compute[183177]: 2026-01-26 19:43:01.358 183181 DEBUG nova.objects.instance [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'info_cache' on Instance uuid e94ea7ed-8444-4fef-a770-3fbf69f0f3dd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:43:01 compute-0 openstack_network_exporter[195363]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:43:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:43:01 compute-0 openstack_network_exporter[195363]: ERROR   19:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:43:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:43:01 compute-0 nova_compute[183177]: 2026-01-26 19:43:01.884 183181 WARNING neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.264 183181 WARNING neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.265 183181 WARNING neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.405 183181 DEBUG neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 565fd712-31bf-40ce-ae6d-d74e1bfee02b for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.406 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.406 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.407 183181 DEBUG nova.network.neutron [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:43:02 compute-0 nova_compute[183177]: 2026-01-26 19:43:02.915 183181 WARNING neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:03 compute-0 nova_compute[183177]: 2026-01-26 19:43:03.867 183181 WARNING neutronclient.v2_0.client [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:04 compute-0 nova_compute[183177]: 2026-01-26 19:43:04.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:04 compute-0 nova_compute[183177]: 2026-01-26 19:43:04.420 183181 DEBUG nova.network.neutron [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e94ea7ed-8444-4fef-a770-3fbf69f0f3dd] Updating instance_info_cache with network_info: [{"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:43:04 compute-0 nova_compute[183177]: 2026-01-26 19:43:04.602 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:04 compute-0 nova_compute[183177]: 2026-01-26 19:43:04.926 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:43:04 compute-0 nova_compute[183177]: 2026-01-26 19:43:04.927 183181 DEBUG nova.objects.instance [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid e94ea7ed-8444-4fef-a770-3fbf69f0f3dd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:43:05 compute-0 podman[207177]: 2026-01-26 19:43:05.360055377 +0000 UTC m=+0.088014217 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.433 183181 DEBUG nova.objects.base [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Object Instance<e94ea7ed-8444-4fef-a770-3fbf69f0f3dd> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.452 183181 DEBUG nova.virt.libvirt.vif [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T19:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-266020774',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-266020774',id=8,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:42:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-96b5d3r0',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:42:52Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=e94ea7ed-8444-4fef-a770-3fbf69f0f3dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.453 183181 DEBUG nova.network.os_vif_util [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "address": "fa:16:3e:58:3a:33", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap565fd712-31", "ovs_interfaceid": "565fd712-31bf-40ce-ae6d-d74e1bfee02b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.454 183181 DEBUG nova.network.os_vif_util [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.455 183181 DEBUG os_vif [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.458 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.458 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap565fd712-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.459 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.462 183181 INFO os_vif [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:3a:33,bridge_name='br-int',has_traffic_filtering=True,id=565fd712-31bf-40ce-ae6d-d74e1bfee02b,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap565fd712-31')
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.463 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:05 compute-0 nova_compute[183177]: 2026-01-26 19:43:05.463 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:06 compute-0 nova_compute[183177]: 2026-01-26 19:43:06.073 183181 DEBUG nova.compute.provider_tree [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:43:06 compute-0 nova_compute[183177]: 2026-01-26 19:43:06.583 183181 DEBUG nova.scheduler.client.report [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:43:07 compute-0 nova_compute[183177]: 2026-01-26 19:43:07.610 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:08 compute-0 nova_compute[183177]: 2026-01-26 19:43:08.209 183181 INFO nova.scheduler.client.report [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration f7582726-b1cc-416b-b57c-278f2efd8baa
Jan 26 19:43:08 compute-0 nova_compute[183177]: 2026-01-26 19:43:08.720 183181 DEBUG oslo_concurrency.lockutils [None req-a2478c31-dd3d-49c9-868b-6f1d50fa938c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e94ea7ed-8444-4fef-a770-3fbf69f0f3dd" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.882s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:09 compute-0 nova_compute[183177]: 2026-01-26 19:43:09.040 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:09 compute-0 nova_compute[183177]: 2026-01-26 19:43:09.604 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:13.696 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:43:13 compute-0 nova_compute[183177]: 2026-01-26 19:43:13.697 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:13.698 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:43:14 compute-0 nova_compute[183177]: 2026-01-26 19:43:14.082 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:14 compute-0 nova_compute[183177]: 2026-01-26 19:43:14.607 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:19 compute-0 nova_compute[183177]: 2026-01-26 19:43:19.083 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:19 compute-0 nova_compute[183177]: 2026-01-26 19:43:19.610 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:19.700 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:43:23 compute-0 sshd-session[207205]: Invalid user admin from 193.32.162.151 port 55122
Jan 26 19:43:23 compute-0 sshd-session[207205]: Connection closed by invalid user admin 193.32.162.151 port 55122 [preauth]
Jan 26 19:43:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:24.041 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:24.041 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:24.042 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:24 compute-0 nova_compute[183177]: 2026-01-26 19:43:24.086 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:24 compute-0 nova_compute[183177]: 2026-01-26 19:43:24.611 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:25 compute-0 podman[207208]: 2026-01-26 19:43:25.407290817 +0000 UTC m=+0.143543519 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:43:28 compute-0 podman[207236]: 2026-01-26 19:43:28.358929839 +0000 UTC m=+0.099059854 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:43:28 compute-0 podman[207237]: 2026-01-26 19:43:28.387531649 +0000 UTC m=+0.123063380 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 19:43:29 compute-0 nova_compute[183177]: 2026-01-26 19:43:29.088 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:29 compute-0 nova_compute[183177]: 2026-01-26 19:43:29.613 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:29 compute-0 podman[192499]: time="2026-01-26T19:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:43:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:43:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Jan 26 19:43:31 compute-0 openstack_network_exporter[195363]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:43:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:43:31 compute-0 openstack_network_exporter[195363]: ERROR   19:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:43:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:43:34 compute-0 nova_compute[183177]: 2026-01-26 19:43:34.091 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:34 compute-0 nova_compute[183177]: 2026-01-26 19:43:34.616 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:36 compute-0 podman[207277]: 2026-01-26 19:43:36.317650257 +0000 UTC m=+0.067892126 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:43:39 compute-0 nova_compute[183177]: 2026-01-26 19:43:39.093 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:39 compute-0 nova_compute[183177]: 2026-01-26 19:43:39.618 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:44 compute-0 nova_compute[183177]: 2026-01-26 19:43:44.094 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:44 compute-0 nova_compute[183177]: 2026-01-26 19:43:44.620 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:47 compute-0 nova_compute[183177]: 2026-01-26 19:43:47.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.098 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.622 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:49 compute-0 nova_compute[183177]: 2026-01-26 19:43:49.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:43:50 compute-0 nova_compute[183177]: 2026-01-26 19:43:50.719 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:43:50 compute-0 nova_compute[183177]: 2026-01-26 19:43:50.804 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:43:50 compute-0 nova_compute[183177]: 2026-01-26 19:43:50.806 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:43:50 compute-0 nova_compute[183177]: 2026-01-26 19:43:50.868 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.116 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.118 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.147 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.148 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5643MB free_disk=73.07587432861328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.149 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.150 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.889 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.890 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.890 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.891 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.892 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:51 compute-0 nova_compute[183177]: 2026-01-26 19:43:51.910 183181 INFO nova.compute.manager [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Terminating instance
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.269 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 04802a55-668d-42ba-bc20-72c2e3f29298 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.269 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.270 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:43:51 up  1:08,  0 user,  load average: 0.18, 0.31, 0.43\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_577ae27ca8cf44549308a35c420ae86d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.404 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.431 183181 DEBUG nova.compute.manager [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:43:52 compute-0 kernel: tap060e9277-c0 (unregistering): left promiscuous mode
Jan 26 19:43:52 compute-0 NetworkManager[55489]: <info>  [1769456632.4572] device (tap060e9277-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.471 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 ovn_controller[95396]: 2026-01-26T19:43:52Z|00085|binding|INFO|Releasing lport 060e9277-c0a7-426c-af54-216da387f47d from this chassis (sb_readonly=0)
Jan 26 19:43:52 compute-0 ovn_controller[95396]: 2026-01-26T19:43:52Z|00086|binding|INFO|Setting lport 060e9277-c0a7-426c-af54-216da387f47d down in Southbound
Jan 26 19:43:52 compute-0 ovn_controller[95396]: 2026-01-26T19:43:52Z|00087|binding|INFO|Removing iface tap060e9277-c0 ovn-installed in OVS
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.474 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.483 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:71:c5 10.100.0.9'], port_security=['fa:16:3e:09:71:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '04802a55-668d-42ba-bc20-72c2e3f29298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02893814-74cb-419e-9539-9a1c8c79b4be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '577ae27ca8cf44549308a35c420ae86d', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'e6bd7c8f-1625-4087-a64f-b17674d29af3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=541eb2a4-ee15-4d4c-b84c-5746be40fade, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=060e9277-c0a7-426c-af54-216da387f47d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.485 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 060e9277-c0a7-426c-af54-216da387f47d in datapath 02893814-74cb-419e-9539-9a1c8c79b4be unbound from our chassis
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.486 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02893814-74cb-419e-9539-9a1c8c79b4be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.492 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8d2c64-e9d2-4c5d-81cf-dffc6fc4124e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.493 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be namespace which is not needed anymore
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.511 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 26 19:43:52 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 22.592s CPU time.
Jan 26 19:43:52 compute-0 systemd-machined[154465]: Machine qemu-4-instance-00000004 terminated.
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.663 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.674 183181 DEBUG nova.compute.manager [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.675 183181 DEBUG oslo_concurrency.lockutils [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.676 183181 DEBUG oslo_concurrency.lockutils [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.676 183181 DEBUG oslo_concurrency.lockutils [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.677 183181 DEBUG nova.compute.manager [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.677 183181 DEBUG nova.compute.manager [req-6f28e7f2-39e0-47fe-baa1-1fd8047d693e req-e9a2afbf-3613-4354-a6b2-15cfa93e51e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.678 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [NOTICE]   (206384) : haproxy version is 3.0.5-8e879a5
Jan 26 19:43:52 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [NOTICE]   (206384) : path to executable is /usr/sbin/haproxy
Jan 26 19:43:52 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [WARNING]  (206384) : Exiting Master process...
Jan 26 19:43:52 compute-0 podman[207334]: 2026-01-26 19:43:52.717417728 +0000 UTC m=+0.072237574 container kill b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:43:52 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [ALERT]    (206384) : Current worker (206386) exited with code 143 (Terminated)
Jan 26 19:43:52 compute-0 neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be[206380]: [WARNING]  (206384) : All workers exited. Exiting... (0)
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.722 183181 INFO nova.virt.libvirt.driver [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Instance destroyed successfully.
Jan 26 19:43:52 compute-0 systemd[1]: libpod-b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6.scope: Deactivated successfully.
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.723 183181 DEBUG nova.objects.instance [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lazy-loading 'resources' on Instance uuid 04802a55-668d-42ba-bc20-72c2e3f29298 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:43:52 compute-0 podman[207362]: 2026-01-26 19:43:52.799832166 +0000 UTC m=+0.047937690 container died b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 26 19:43:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-22a2e47e432207fb5cdd00439037e488a42d2da2d92634960f74ae3f71fa5fef-merged.mount: Deactivated successfully.
Jan 26 19:43:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6-userdata-shm.mount: Deactivated successfully.
Jan 26 19:43:52 compute-0 podman[207362]: 2026-01-26 19:43:52.83858876 +0000 UTC m=+0.086694244 container cleanup b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 19:43:52 compute-0 systemd[1]: libpod-conmon-b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6.scope: Deactivated successfully.
Jan 26 19:43:52 compute-0 podman[207364]: 2026-01-26 19:43:52.855727561 +0000 UTC m=+0.087091005 container remove b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.864 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a785c91b-b9fc-476c-a371-acb7fd44c7ac]: (4, ("Mon Jan 26 07:43:52 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be (b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6)\nb9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6\nMon Jan 26 07:43:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be (b9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6)\nb9de85450b6a1c6a9451b61ca24553f193b23391d67b5048cbf40f1ab402fdf6\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.866 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[826e0dfc-2d74-42c8-a3b1-056286bed5dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.867 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02893814-74cb-419e-9539-9a1c8c79b4be.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.868 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[37fe948d-6722-41b7-a93f-b5f2eb709d9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.868 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02893814-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.871 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 kernel: tap02893814-70: left promiscuous mode
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.893 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.896 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f0788a-4cbc-4eab-89ff-b7622040067d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 nova_compute[183177]: 2026-01-26 19:43:52.914 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.915 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[948b8d80-31a4-4377-ae27-8224448aebed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.916 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7f149e-0394-492e-9db2-e75a8ce6ddaf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.940 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[beb77ee4-f026-44ca-957c-9a070f053492]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388458, 'reachable_time': 39774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207398, 'error': None, 'target': 'ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.944 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02893814-74cb-419e-9539-9a1c8c79b4be deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:43:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:43:52.945 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3a0ac8-b2cd-4e84-b8e9-1acf754577b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:43:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d02893814\x2d74cb\x2d419e\x2d9539\x2d9a1c8c79b4be.mount: Deactivated successfully.
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.232 183181 DEBUG nova.virt.libvirt.vif [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:39:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-174497542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-174497542',id=4,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:40:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='577ae27ca8cf44549308a35c420ae86d',ramdisk_id='',reservation_id='r-xnbzkhzb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1232791976',owner_user_name='tempest-TestExecuteActionsViaActuator-1232791976-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:40:44Z,user_data=None,user_id='ee1e0029a6ac4b56b09c165dc3cd4dda',uuid=04802a55-668d-42ba-bc20-72c2e3f29298,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.233 183181 DEBUG nova.network.os_vif_util [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converting VIF {"id": "060e9277-c0a7-426c-af54-216da387f47d", "address": "fa:16:3e:09:71:c5", "network": {"id": "02893814-74cb-419e-9539-9a1c8c79b4be", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-156331889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "016b50a9944a48cc96f3b5dca58e6a4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap060e9277-c0", "ovs_interfaceid": "060e9277-c0a7-426c-af54-216da387f47d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.234 183181 DEBUG nova.network.os_vif_util [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.235 183181 DEBUG os_vif [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.238 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.238 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap060e9277-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.241 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.243 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.244 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.245 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f819ef1f-2988-4fb3-a751-da7727c0fbeb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.246 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.247 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.250 183181 INFO os_vif [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:71:c5,bridge_name='br-int',has_traffic_filtering=True,id=060e9277-c0a7-426c-af54-216da387f47d,network=Network(02893814-74cb-419e-9539-9a1c8c79b4be),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap060e9277-c0')
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.250 183181 INFO nova.virt.libvirt.driver [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Deleting instance files /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_del
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.251 183181 INFO nova.virt.libvirt.driver [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Deletion of /var/lib/nova/instances/04802a55-668d-42ba-bc20-72c2e3f29298_del complete
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.424 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.425 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.275s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.769 183181 INFO nova.compute.manager [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Took 1.34 seconds to destroy the instance on the hypervisor.
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.771 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.771 183181 DEBUG nova.compute.manager [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.772 183181 DEBUG nova.network.neutron [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:43:53 compute-0 nova_compute[183177]: 2026-01-26 19:43:53.772 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.100 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.425 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.426 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.427 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.427 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.428 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.428 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.429 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.463 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.749 183181 DEBUG nova.compute.manager [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.750 183181 DEBUG oslo_concurrency.lockutils [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.751 183181 DEBUG oslo_concurrency.lockutils [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.751 183181 DEBUG oslo_concurrency.lockutils [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.752 183181 DEBUG nova.compute.manager [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] No waiting events found dispatching network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:43:54 compute-0 nova_compute[183177]: 2026-01-26 19:43:54.752 183181 DEBUG nova.compute.manager [req-2e5e5822-a0e6-41af-971b-53cd958fdd9f req-b4768b4a-ef10-477f-bda2-f5fe8bb37d3a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-unplugged-060e9277-c0a7-426c-af54-216da387f47d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:43:55 compute-0 nova_compute[183177]: 2026-01-26 19:43:55.580 183181 DEBUG nova.network.neutron [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:43:56 compute-0 nova_compute[183177]: 2026-01-26 19:43:56.089 183181 INFO nova.compute.manager [-] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Took 2.32 seconds to deallocate network for instance.
Jan 26 19:43:56 compute-0 podman[207399]: 2026-01-26 19:43:56.400790949 +0000 UTC m=+0.144070928 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 19:43:56 compute-0 nova_compute[183177]: 2026-01-26 19:43:56.620 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:43:56 compute-0 nova_compute[183177]: 2026-01-26 19:43:56.621 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:43:56 compute-0 nova_compute[183177]: 2026-01-26 19:43:56.686 183181 DEBUG nova.compute.provider_tree [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:43:56 compute-0 nova_compute[183177]: 2026-01-26 19:43:56.833 183181 DEBUG nova.compute.manager [req-e9980307-9cfa-4c98-8a55-bc246c13f9b2 req-66cd9526-05af-42cc-b9ea-71386a38db12 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 04802a55-668d-42ba-bc20-72c2e3f29298] Received event network-vif-deleted-060e9277-c0a7-426c-af54-216da387f47d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:43:57 compute-0 nova_compute[183177]: 2026-01-26 19:43:57.151 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:43:57 compute-0 nova_compute[183177]: 2026-01-26 19:43:57.198 183181 DEBUG nova.scheduler.client.report [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:43:57 compute-0 nova_compute[183177]: 2026-01-26 19:43:57.710 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:57 compute-0 nova_compute[183177]: 2026-01-26 19:43:57.735 183181 INFO nova.scheduler.client.report [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Deleted allocations for instance 04802a55-668d-42ba-bc20-72c2e3f29298
Jan 26 19:43:58 compute-0 nova_compute[183177]: 2026-01-26 19:43:58.248 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:58 compute-0 nova_compute[183177]: 2026-01-26 19:43:58.774 183181 DEBUG oslo_concurrency.lockutils [None req-3e1bca44-51bf-48a1-aee3-e44e34f3e1e1 ee1e0029a6ac4b56b09c165dc3cd4dda 577ae27ca8cf44549308a35c420ae86d - - default default] Lock "04802a55-668d-42ba-bc20-72c2e3f29298" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.884s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:43:59 compute-0 nova_compute[183177]: 2026-01-26 19:43:59.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:43:59 compute-0 podman[207426]: 2026-01-26 19:43:59.339255914 +0000 UTC m=+0.077008463 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 19:43:59 compute-0 podman[207425]: 2026-01-26 19:43:59.355405968 +0000 UTC m=+0.100331060 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Jan 26 19:43:59 compute-0 podman[192499]: time="2026-01-26T19:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:43:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:43:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 26 19:44:01 compute-0 openstack_network_exporter[195363]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:44:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:44:01 compute-0 openstack_network_exporter[195363]: ERROR   19:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:44:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:44:03 compute-0 nova_compute[183177]: 2026-01-26 19:44:03.250 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:04 compute-0 nova_compute[183177]: 2026-01-26 19:44:04.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:04 compute-0 nova_compute[183177]: 2026-01-26 19:44:04.454 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:07 compute-0 podman[207466]: 2026-01-26 19:44:07.367684581 +0000 UTC m=+0.109290832 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:44:08 compute-0 nova_compute[183177]: 2026-01-26 19:44:08.252 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:09 compute-0 nova_compute[183177]: 2026-01-26 19:44:09.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:13 compute-0 nova_compute[183177]: 2026-01-26 19:44:13.254 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:14 compute-0 nova_compute[183177]: 2026-01-26 19:44:14.109 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:15 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:15.731 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:44:15 compute-0 nova_compute[183177]: 2026-01-26 19:44:15.731 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:15 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:15.732 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:44:18 compute-0 nova_compute[183177]: 2026-01-26 19:44:18.257 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:18.536 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:bd:84 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adee17f8a0194f5eb330b178ca303941', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b55a762-8a6e-494a-8fe1-e2f6fe7cb4f9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a249be39-6b4f-40bc-acd2-fb81baa61f02) old=Port_Binding(mac=['fa:16:3e:1f:bd:84'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adee17f8a0194f5eb330b178ca303941', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:44:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:18.538 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a249be39-6b4f-40bc-acd2-fb81baa61f02 in datapath 0501a2fa-d9aa-43e0-a6c7-2ea169228252 updated
Jan 26 19:44:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:18.539 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0501a2fa-d9aa-43e0-a6c7-2ea169228252, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:44:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:18.542 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d183889-c011-4a2e-8646-d51999bea185]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:19 compute-0 nova_compute[183177]: 2026-01-26 19:44:19.111 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:20 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:20.733 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:23 compute-0 nova_compute[183177]: 2026-01-26 19:44:23.259 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:24.043 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:24.044 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:24.044 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:24 compute-0 nova_compute[183177]: 2026-01-26 19:44:24.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:27 compute-0 podman[207493]: 2026-01-26 19:44:27.404409604 +0000 UTC m=+0.145008893 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:44:28 compute-0 nova_compute[183177]: 2026-01-26 19:44:28.261 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:29 compute-0 nova_compute[183177]: 2026-01-26 19:44:29.159 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:29.534 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:04:eb 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-db8df7a1-46d9-438f-a7b3-f710eaa774b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db8df7a1-46d9-438f-a7b3-f710eaa774b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27bcc2d7-8ae4-4403-93e8-2e6e8173a2f1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=72df3c12-d8e1-4914-9ce1-1237665e5a3b) old=Port_Binding(mac=['fa:16:3e:4b:04:eb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-db8df7a1-46d9-438f-a7b3-f710eaa774b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db8df7a1-46d9-438f-a7b3-f710eaa774b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:44:30 compute-0 podman[192499]: time="2026-01-26T19:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:44:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:44:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 26 19:44:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:29.535 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 72df3c12-d8e1-4914-9ce1-1237665e5a3b in datapath db8df7a1-46d9-438f-a7b3-f710eaa774b1 updated
Jan 26 19:44:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:29.537 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db8df7a1-46d9-438f-a7b3-f710eaa774b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:44:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:30.317 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7908d0-9f80-498f-b222-cc8ba79a54e5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:30 compute-0 podman[207522]: 2026-01-26 19:44:30.436688813 +0000 UTC m=+0.089037157 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Jan 26 19:44:30 compute-0 podman[207523]: 2026-01-26 19:44:30.439998772 +0000 UTC m=+0.082197652 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:44:31 compute-0 openstack_network_exporter[195363]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:44:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:44:31 compute-0 openstack_network_exporter[195363]: ERROR   19:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:44:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:44:33 compute-0 nova_compute[183177]: 2026-01-26 19:44:33.264 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:34 compute-0 nova_compute[183177]: 2026-01-26 19:44:34.160 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:38 compute-0 nova_compute[183177]: 2026-01-26 19:44:38.266 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:38 compute-0 podman[207556]: 2026-01-26 19:44:38.326630394 +0000 UTC m=+0.072409380 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:44:38 compute-0 nova_compute[183177]: 2026-01-26 19:44:38.897 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:38 compute-0 nova_compute[183177]: 2026-01-26 19:44:38.897 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.164 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.404 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:44:39 compute-0 ovn_controller[95396]: 2026-01-26T19:44:39Z|00088|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.979 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.979 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.989 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:44:39 compute-0 nova_compute[183177]: 2026-01-26 19:44:39.990 183181 INFO nova.compute.claims [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:44:41 compute-0 nova_compute[183177]: 2026-01-26 19:44:41.148 183181 DEBUG nova.compute.provider_tree [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:44:41 compute-0 nova_compute[183177]: 2026-01-26 19:44:41.657 183181 DEBUG nova.scheduler.client.report [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.168 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.189s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.170 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.684 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.685 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.685 183181 WARNING neutronclient.v2_0.client [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:44:42 compute-0 nova_compute[183177]: 2026-01-26 19:44:42.686 183181 WARNING neutronclient.v2_0.client [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:44:43 compute-0 nova_compute[183177]: 2026-01-26 19:44:43.193 183181 INFO nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:44:43 compute-0 nova_compute[183177]: 2026-01-26 19:44:43.268 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:43 compute-0 nova_compute[183177]: 2026-01-26 19:44:43.704 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.166 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.646 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Successfully created port: 360af8e5-9c2d-468f-a325-ced8745d90f1 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.721 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.723 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.724 183181 INFO nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Creating image(s)
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.725 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.726 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.727 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.729 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.735 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.739 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.829 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.830 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.831 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.832 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.839 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.839 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.919 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.921 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.976 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.978 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:44 compute-0 nova_compute[183177]: 2026-01-26 19:44:44.979 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.035 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.036 183181 DEBUG nova.virt.disk.api [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Checking if we can resize image /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.036 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.131 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.132 183181 DEBUG nova.virt.disk.api [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Cannot resize image /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.133 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.133 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Ensure instance console log exists: /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.134 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.135 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.135 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.701 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Successfully updated port: 360af8e5-9c2d-468f-a325-ced8745d90f1 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.786 183181 DEBUG nova.compute.manager [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-changed-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.786 183181 DEBUG nova.compute.manager [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Refreshing instance network info cache due to event network-changed-360af8e5-9c2d-468f-a325-ced8745d90f1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.787 183181 DEBUG oslo_concurrency.lockutils [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.787 183181 DEBUG oslo_concurrency.lockutils [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:44:45 compute-0 nova_compute[183177]: 2026-01-26 19:44:45.788 183181 DEBUG nova.network.neutron [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Refreshing network info cache for port 360af8e5-9c2d-468f-a325-ced8745d90f1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:44:46 compute-0 nova_compute[183177]: 2026-01-26 19:44:46.209 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:44:46 compute-0 nova_compute[183177]: 2026-01-26 19:44:46.294 183181 WARNING neutronclient.v2_0.client [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:44:46 compute-0 nova_compute[183177]: 2026-01-26 19:44:46.464 183181 DEBUG nova.network.neutron [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:44:46 compute-0 nova_compute[183177]: 2026-01-26 19:44:46.620 183181 DEBUG nova.network.neutron [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:44:47 compute-0 nova_compute[183177]: 2026-01-26 19:44:47.125 183181 DEBUG oslo_concurrency.lockutils [req-325472fb-a7f6-478e-9b7b-b5867850090f req-127b2d7c-24a6-4d47-93ad-f1be51c1f296 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:44:47 compute-0 nova_compute[183177]: 2026-01-26 19:44:47.126 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquired lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:44:47 compute-0 nova_compute[183177]: 2026-01-26 19:44:47.126 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:44:47 compute-0 nova_compute[183177]: 2026-01-26 19:44:47.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:48 compute-0 nova_compute[183177]: 2026-01-26 19:44:48.270 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:48 compute-0 nova_compute[183177]: 2026-01-26 19:44:48.454 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:44:49 compute-0 nova_compute[183177]: 2026-01-26 19:44:49.202 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:49 compute-0 nova_compute[183177]: 2026-01-26 19:44:49.554 183181 WARNING neutronclient.v2_0.client [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.637 183181 DEBUG nova.network.neutron [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Updating instance_info_cache with network_info: [{"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.667 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.917 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.919 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.947 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.948 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5813MB free_disk=73.10287094116211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.949 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:50 compute-0 nova_compute[183177]: 2026-01-26 19:44:50.949 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.147 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Releasing lock "refresh_cache-388082f6-0664-4bc2-844f-e9545548138b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.148 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance network_info: |[{"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.152 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Start _get_guest_xml network_info=[{"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.157 183181 WARNING nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.159 183181 DEBUG nova.virt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1795175879', uuid='388082f6-0664-4bc2-844f-e9545548138b'), owner=OwnerMeta(userid='2f579cdd19584229bbf4f240effa28f3', username='tempest-TestExecuteBasicStrategy-1074421461-project-admin', projectid='315dd5c96f24487b9b621d7237bc35ed', projectname='tempest-TestExecuteBasicStrategy-1074421461'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456691.1597402) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.166 183181 DEBUG nova.virt.libvirt.host [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.167 183181 DEBUG nova.virt.libvirt.host [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.171 183181 DEBUG nova.virt.libvirt.host [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.172 183181 DEBUG nova.virt.libvirt.host [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.174 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.174 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.175 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.176 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.176 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.177 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.177 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.177 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.178 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.178 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.179 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.179 183181 DEBUG nova.virt.hardware [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.186 183181 DEBUG nova.virt.libvirt.vif [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1795175879',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1795175879',id=10,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='315dd5c96f24487b9b621d7237bc35ed',ramdisk_id='',reservation_id='r-wvrrsxk9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1074421461',owner_user_name='tempest-TestExecuteBasicStrategy-1074421461-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:44:43Z,user_data=None,user_id='2f579cdd19584229bbf4f240effa28f3',uuid=388082f6-0664-4bc2-844f-e9545548138b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.187 183181 DEBUG nova.network.os_vif_util [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converting VIF {"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.188 183181 DEBUG nova.network.os_vif_util [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.190 183181 DEBUG nova.objects.instance [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lazy-loading 'pci_devices' on Instance uuid 388082f6-0664-4bc2-844f-e9545548138b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.699 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <uuid>388082f6-0664-4bc2-844f-e9545548138b</uuid>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <name>instance-0000000a</name>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1795175879</nova:name>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:44:51</nova:creationTime>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:44:51 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:44:51 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:user uuid="2f579cdd19584229bbf4f240effa28f3">tempest-TestExecuteBasicStrategy-1074421461-project-admin</nova:user>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:project uuid="315dd5c96f24487b9b621d7237bc35ed">tempest-TestExecuteBasicStrategy-1074421461</nova:project>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         <nova:port uuid="360af8e5-9c2d-468f-a325-ced8745d90f1">
Jan 26 19:44:51 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <system>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="serial">388082f6-0664-4bc2-844f-e9545548138b</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="uuid">388082f6-0664-4bc2-844f-e9545548138b</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </system>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <os>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </os>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <features>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </features>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.config"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:e5:9b:79"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <target dev="tap360af8e5-9c"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/console.log" append="off"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <video>
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </video>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:44:51 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:44:51 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:44:51 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:44:51 compute-0 nova_compute[183177]: </domain>
Jan 26 19:44:51 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.700 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Preparing to wait for external event network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.700 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.701 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.701 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.702 183181 DEBUG nova.virt.libvirt.vif [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1795175879',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1795175879',id=10,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='315dd5c96f24487b9b621d7237bc35ed',ramdisk_id='',reservation_id='r-wvrrsxk9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1074421461',owner_user_name='tempest-TestExecuteBasicStrategy-1074421461-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:44:43Z,user_data=None,user_id='2f579cdd19584229bbf4f240effa28f3',uuid=388082f6-0664-4bc2-844f-e9545548138b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.702 183181 DEBUG nova.network.os_vif_util [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converting VIF {"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.703 183181 DEBUG nova.network.os_vif_util [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.704 183181 DEBUG os_vif [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.704 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.705 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.706 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.707 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f9abe490-366a-540e-8e98-a93f783ef4e5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.741 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.743 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.746 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.747 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap360af8e5-9c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.748 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap360af8e5-9c, col_values=(('qos', UUID('d2a2bf38-d1e0-4587-8feb-baed6e347f39')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.748 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap360af8e5-9c, col_values=(('external_ids', {'iface-id': '360af8e5-9c2d-468f-a325-ced8745d90f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:9b:79', 'vm-uuid': '388082f6-0664-4bc2-844f-e9545548138b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.750 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 NetworkManager[55489]: <info>  [1769456691.7517] manager: (tap360af8e5-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.758 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:51 compute-0 nova_compute[183177]: 2026-01-26 19:44:51.759 183181 INFO os_vif [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c')
Jan 26 19:44:52 compute-0 nova_compute[183177]: 2026-01-26 19:44:52.024 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 388082f6-0664-4bc2-844f-e9545548138b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:44:52 compute-0 nova_compute[183177]: 2026-01-26 19:44:52.025 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:44:52 compute-0 nova_compute[183177]: 2026-01-26 19:44:52.025 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:44:50 up  1:09,  0 user,  load average: 0.10, 0.26, 0.41\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_315dd5c96f24487b9b621d7237bc35ed': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:44:52 compute-0 nova_compute[183177]: 2026-01-26 19:44:52.063 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:44:52 compute-0 nova_compute[183177]: 2026-01-26 19:44:52.571 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.084 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.084 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.307 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.308 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.308 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] No VIF found with MAC fa:16:3e:e5:9b:79, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.309 183181 INFO nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Using config drive
Jan 26 19:44:53 compute-0 nova_compute[183177]: 2026-01-26 19:44:53.820 183181 WARNING neutronclient.v2_0.client [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.011 183181 INFO nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Creating config drive at /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.config
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.017 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpyi8qizrm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.162 183181 DEBUG oslo_concurrency.processutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpyi8qizrm" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.204 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 kernel: tap360af8e5-9c: entered promiscuous mode
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.2294] manager: (tap360af8e5-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 26 19:44:54 compute-0 ovn_controller[95396]: 2026-01-26T19:44:54Z|00089|binding|INFO|Claiming lport 360af8e5-9c2d-468f-a325-ced8745d90f1 for this chassis.
Jan 26 19:44:54 compute-0 ovn_controller[95396]: 2026-01-26T19:44:54Z|00090|binding|INFO|360af8e5-9c2d-468f-a325-ced8745d90f1: Claiming fa:16:3e:e5:9b:79 10.100.0.6
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.229 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.238 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.247 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:9b:79 10.100.0.6'], port_security=['fa:16:3e:e5:9b:79 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '388082f6-0664-4bc2-844f-e9545548138b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b324a45-9260-4a52-b269-ee106dc2cc6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b55a762-8a6e-494a-8fe1-e2f6fe7cb4f9, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=360af8e5-9c2d-468f-a325-ced8745d90f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.248 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 360af8e5-9c2d-468f-a325-ced8745d90f1 in datapath 0501a2fa-d9aa-43e0-a6c7-2ea169228252 bound to our chassis
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.249 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0501a2fa-d9aa-43e0-a6c7-2ea169228252
Jan 26 19:44:54 compute-0 systemd-udevd[207617]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.266 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c4859f23-b1f1-4a0a-92be-b8848aa7f06d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.268 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0501a2fa-d1 in ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.270 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0501a2fa-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.270 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d96c11-9bde-4fb7-b97b-d301ba37d0d6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.271 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4a9679-e264-4760-9fa3-ff6a3849bb00]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 systemd-machined[154465]: New machine qemu-7-instance-0000000a.
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.284 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c7e561-754b-4fee-8169-3782cfcd7665]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.2855] device (tap360af8e5-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.2862] device (tap360af8e5-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.316 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6d41e3-99d4-4f02-ab84-260d599d014b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.323 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_controller[95396]: 2026-01-26T19:44:54Z|00091|binding|INFO|Setting lport 360af8e5-9c2d-468f-a325-ced8745d90f1 ovn-installed in OVS
Jan 26 19:44:54 compute-0 ovn_controller[95396]: 2026-01-26T19:44:54Z|00092|binding|INFO|Setting lport 360af8e5-9c2d-468f-a325-ced8745d90f1 up in Southbound
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.327 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.379 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d27b9a63-ca30-4aa1-aa55-4c086a693737]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.384 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[837d5e43-402a-48e9-b2fe-bc4e145082e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.3868] manager: (tap0501a2fa-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.428 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d5320fa8-a490-49a1-a4ad-2110ea01c195]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.433 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[f580b47a-0861-4da7-963c-4bc3ba2c7cc3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.4625] device (tap0501a2fa-d0): carrier: link connected
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.474 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a04749fe-db81-452f-89b4-d323ef5a3e4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.494 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6596c803-b433-4701-a28c-685ffa4c987a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0501a2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:bd:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415334, 'reachable_time': 31502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207650, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.519 183181 DEBUG nova.compute.manager [req-330b64f4-3fb7-451b-9aa1-5f47d2ddd412 req-7d84a621-1f4b-4825-8237-a9bd21886403 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.519 183181 DEBUG oslo_concurrency.lockutils [req-330b64f4-3fb7-451b-9aa1-5f47d2ddd412 req-7d84a621-1f4b-4825-8237-a9bd21886403 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.519 183181 DEBUG oslo_concurrency.lockutils [req-330b64f4-3fb7-451b-9aa1-5f47d2ddd412 req-7d84a621-1f4b-4825-8237-a9bd21886403 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.519 183181 DEBUG oslo_concurrency.lockutils [req-330b64f4-3fb7-451b-9aa1-5f47d2ddd412 req-7d84a621-1f4b-4825-8237-a9bd21886403 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.520 183181 DEBUG nova.compute.manager [req-330b64f4-3fb7-451b-9aa1-5f47d2ddd412 req-7d84a621-1f4b-4825-8237-a9bd21886403 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Processing event network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.522 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[21e02d7e-efab-4a99-95d8-929d9f808331]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:bd84'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415334, 'tstamp': 415334}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207651, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.541 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[95204409-a072-44ac-b4c4-594bd3ef50a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0501a2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:bd:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415334, 'reachable_time': 31502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207652, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.582 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c0e4a5-e0a7-45b3-be94-cfb1826b8f4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.675 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c5ff3c1b-cebd-4207-8d56-30878fa493e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.676 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0501a2fa-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.676 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.677 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0501a2fa-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:54 compute-0 kernel: tap0501a2fa-d0: entered promiscuous mode
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.678 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 NetworkManager[55489]: <info>  [1769456694.6798] manager: (tap0501a2fa-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.681 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.682 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0501a2fa-d0, col_values=(('external_ids', {'iface-id': 'a249be39-6b4f-40bc-acd2-fb81baa61f02'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.683 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_controller[95396]: 2026-01-26T19:44:54Z|00093|binding|INFO|Releasing lport a249be39-6b4f-40bc-acd2-fb81baa61f02 from this chassis (sb_readonly=0)
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.706 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.708 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9afc7beb-f1c9-40aa-9228-1686112daf54]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.709 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.710 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.710 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0501a2fa-d9aa-43e0-a6c7-2ea169228252 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.710 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.711 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5ecce8-3f84-4b6e-8269-1ce9c86d7b14]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.711 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.712 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1bf1dc-9e9c-4c60-885e-80231a9827d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.712 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-0501a2fa-d9aa-43e0-a6c7-2ea169228252
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 0501a2fa-d9aa-43e0-a6c7-2ea169228252
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:44:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:44:54.714 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'env', 'PROCESS_TAG=haproxy-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0501a2fa-d9aa-43e0-a6c7-2ea169228252.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.733 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.742 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.746 183181 INFO nova.virt.libvirt.driver [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance spawned successfully.
Jan 26 19:44:54 compute-0 nova_compute[183177]: 2026-01-26 19:44:54.747 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.080 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.081 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.081 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.081 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.081 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:44:55 compute-0 podman[207691]: 2026-01-26 19:44:55.154522007 +0000 UTC m=+0.081899565 container create c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, org.label-schema.build-date=20260120, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:44:55 compute-0 podman[207691]: 2026-01-26 19:44:55.114446158 +0000 UTC m=+0.041823776 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:44:55 compute-0 systemd[1]: Started libpod-conmon-c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8.scope.
Jan 26 19:44:55 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:44:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d597fea71c6252501bcffe3ab981a04ff7f02c71a45bd992a16c25dc434448d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.262 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.263 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.263 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.264 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.264 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.264 183181 DEBUG nova.virt.libvirt.driver [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:44:55 compute-0 podman[207691]: 2026-01-26 19:44:55.275405329 +0000 UTC m=+0.202782877 container init c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 19:44:55 compute-0 podman[207691]: 2026-01-26 19:44:55.28586718 +0000 UTC m=+0.213244708 container start c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 26 19:44:55 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [NOTICE]   (207711) : New worker (207713) forked
Jan 26 19:44:55 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [NOTICE]   (207711) : Loading success.
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.774 183181 INFO nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Took 11.05 seconds to spawn the instance on the hypervisor.
Jan 26 19:44:55 compute-0 nova_compute[183177]: 2026-01-26 19:44:55.774 183181 DEBUG nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.322 183181 INFO nova.compute.manager [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Took 16.40 seconds to build instance.
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.597 183181 DEBUG nova.compute.manager [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.598 183181 DEBUG oslo_concurrency.lockutils [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.598 183181 DEBUG oslo_concurrency.lockutils [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.599 183181 DEBUG oslo_concurrency.lockutils [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.599 183181 DEBUG nova.compute.manager [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] No waiting events found dispatching network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.600 183181 WARNING nova.compute.manager [req-b7868f19-e552-4075-91f3-571cca6a3c79 req-fd54a931-f906-4be8-b17b-3754aac7de64 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received unexpected event network-vif-plugged-360af8e5-9c2d-468f-a325-ced8745d90f1 for instance with vm_state active and task_state None.
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.750 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:56 compute-0 nova_compute[183177]: 2026-01-26 19:44:56.827 183181 DEBUG oslo_concurrency.lockutils [None req-b7d29678-8665-4035-a4b8-5aac6d39afe6 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.930s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:44:58 compute-0 podman[207722]: 2026-01-26 19:44:58.386294174 +0000 UTC m=+0.133518124 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 26 19:44:59 compute-0 nova_compute[183177]: 2026-01-26 19:44:59.207 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:44:59 compute-0 podman[192499]: time="2026-01-26T19:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:44:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:44:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Jan 26 19:45:01 compute-0 podman[207752]: 2026-01-26 19:45:01.328767676 +0000 UTC m=+0.070361234 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:45:01 compute-0 podman[207751]: 2026-01-26 19:45:01.331825959 +0000 UTC m=+0.072991676 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Jan 26 19:45:01 compute-0 openstack_network_exporter[195363]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:45:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:45:01 compute-0 openstack_network_exporter[195363]: ERROR   19:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:45:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:45:01 compute-0 nova_compute[183177]: 2026-01-26 19:45:01.751 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:04 compute-0 nova_compute[183177]: 2026-01-26 19:45:04.209 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:06 compute-0 nova_compute[183177]: 2026-01-26 19:45:06.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:07 compute-0 ovn_controller[95396]: 2026-01-26T19:45:07Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:9b:79 10.100.0.6
Jan 26 19:45:07 compute-0 ovn_controller[95396]: 2026-01-26T19:45:07Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:9b:79 10.100.0.6
Jan 26 19:45:09 compute-0 nova_compute[183177]: 2026-01-26 19:45:09.211 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:09 compute-0 podman[207801]: 2026-01-26 19:45:09.318911023 +0000 UTC m=+0.065801641 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:45:11 compute-0 nova_compute[183177]: 2026-01-26 19:45:11.755 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:14 compute-0 nova_compute[183177]: 2026-01-26 19:45:14.247 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:16 compute-0 nova_compute[183177]: 2026-01-26 19:45:16.757 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:16 compute-0 nova_compute[183177]: 2026-01-26 19:45:16.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:16 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:16.819 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:45:16 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:16.821 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:45:19 compute-0 nova_compute[183177]: 2026-01-26 19:45:19.250 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:19.822 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:45:21 compute-0 nova_compute[183177]: 2026-01-26 19:45:21.759 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:24.045 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:45:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:24.045 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:45:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:45:24.045 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:45:24 compute-0 nova_compute[183177]: 2026-01-26 19:45:24.254 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:26 compute-0 nova_compute[183177]: 2026-01-26 19:45:26.761 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:29 compute-0 nova_compute[183177]: 2026-01-26 19:45:29.256 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:29 compute-0 podman[207829]: 2026-01-26 19:45:29.679660082 +0000 UTC m=+0.414447454 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:45:29 compute-0 podman[192499]: time="2026-01-26T19:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:45:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:45:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 19:45:31 compute-0 openstack_network_exporter[195363]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:45:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:45:31 compute-0 openstack_network_exporter[195363]: ERROR   19:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:45:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:45:31 compute-0 nova_compute[183177]: 2026-01-26 19:45:31.763 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:32 compute-0 podman[207857]: 2026-01-26 19:45:32.307430256 +0000 UTC m=+0.058683370 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 19:45:32 compute-0 podman[207856]: 2026-01-26 19:45:32.336308563 +0000 UTC m=+0.078671668 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 19:45:34 compute-0 nova_compute[183177]: 2026-01-26 19:45:34.258 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:34 compute-0 sshd-session[207898]: Invalid user postgres from 193.32.162.151 port 60698
Jan 26 19:45:34 compute-0 sshd-session[207898]: Connection closed by invalid user postgres 193.32.162.151 port 60698 [preauth]
Jan 26 19:45:36 compute-0 nova_compute[183177]: 2026-01-26 19:45:36.765 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:39 compute-0 nova_compute[183177]: 2026-01-26 19:45:39.305 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:40 compute-0 podman[207900]: 2026-01-26 19:45:40.34283029 +0000 UTC m=+0.079037849 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:45:41 compute-0 nova_compute[183177]: 2026-01-26 19:45:41.770 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:44 compute-0 nova_compute[183177]: 2026-01-26 19:45:44.323 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:45 compute-0 ovn_controller[95396]: 2026-01-26T19:45:45Z|00094|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Jan 26 19:45:46 compute-0 nova_compute[183177]: 2026-01-26 19:45:46.774 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:48 compute-0 nova_compute[183177]: 2026-01-26 19:45:48.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:49 compute-0 nova_compute[183177]: 2026-01-26 19:45:49.360 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.611 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Creating tmpfile /var/lib/nova/instances/tmp_khr_hbg to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.612 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.671 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.708 183181 DEBUG nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_khr_hbg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 19:45:51 compute-0 nova_compute[183177]: 2026-01-26 19:45:51.810 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:52 compute-0 nova_compute[183177]: 2026-01-26 19:45:52.725 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:45:52 compute-0 nova_compute[183177]: 2026-01-26 19:45:52.823 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:45:52 compute-0 nova_compute[183177]: 2026-01-26 19:45:52.824 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:45:52 compute-0 nova_compute[183177]: 2026-01-26 19:45:52.913 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.133 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.136 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.161 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.162 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5651MB free_disk=73.07410430908203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.162 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.163 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:45:53 compute-0 nova_compute[183177]: 2026-01-26 19:45:53.763 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.215 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 388082f6-0664-4bc2-844f-e9545548138b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.362 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.723 183181 WARNING nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 1836bbd4-abe5-4658-8980-6c9ef3a08026 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.723 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.724 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:45:53 up  1:10,  0 user,  load average: 0.14, 0.25, 0.40\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_315dd5c96f24487b9b621d7237bc35ed': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:45:54 compute-0 nova_compute[183177]: 2026-01-26 19:45:54.804 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:45:55 compute-0 nova_compute[183177]: 2026-01-26 19:45:55.312 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:45:55 compute-0 nova_compute[183177]: 2026-01-26 19:45:55.824 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:45:55 compute-0 nova_compute[183177]: 2026-01-26 19:45:55.825 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.662s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:45:56 compute-0 nova_compute[183177]: 2026-01-26 19:45:56.813 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.803 183181 DEBUG nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_khr_hbg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1836bbd4-abe5-4658-8980-6c9ef3a08026',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.821 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.821 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.821 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.821 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:57 compute-0 nova_compute[183177]: 2026-01-26 19:45:57.822 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:45:58 compute-0 nova_compute[183177]: 2026-01-26 19:45:58.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:45:58 compute-0 nova_compute[183177]: 2026-01-26 19:45:58.819 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:45:58 compute-0 nova_compute[183177]: 2026-01-26 19:45:58.820 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:45:58 compute-0 nova_compute[183177]: 2026-01-26 19:45:58.820 183181 DEBUG nova.network.neutron [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:45:59 compute-0 nova_compute[183177]: 2026-01-26 19:45:59.328 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:45:59 compute-0 nova_compute[183177]: 2026-01-26 19:45:59.365 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:45:59 compute-0 podman[192499]: time="2026-01-26T19:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:45:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:45:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 26 19:46:00 compute-0 podman[207932]: 2026-01-26 19:46:00.388067077 +0000 UTC m=+0.130144335 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:46:01 compute-0 nova_compute[183177]: 2026-01-26 19:46:01.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:01 compute-0 openstack_network_exporter[195363]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:46:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:46:01 compute-0 openstack_network_exporter[195363]: ERROR   19:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:46:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:46:01 compute-0 nova_compute[183177]: 2026-01-26 19:46:01.815 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:02 compute-0 nova_compute[183177]: 2026-01-26 19:46:02.430 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:03 compute-0 podman[207959]: 2026-01-26 19:46:03.313270314 +0000 UTC m=+0.059255303 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=openstack_network_exporter)
Jan 26 19:46:03 compute-0 podman[207960]: 2026-01-26 19:46:03.329717925 +0000 UTC m=+0.058460712 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 19:46:03 compute-0 nova_compute[183177]: 2026-01-26 19:46:03.479 183181 DEBUG nova.network.neutron [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Updating instance_info_cache with network_info: [{"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:03 compute-0 nova_compute[183177]: 2026-01-26 19:46:03.985 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.011 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_khr_hbg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1836bbd4-abe5-4658-8980-6c9ef3a08026',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.012 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Creating instance directory: /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.013 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Creating disk.info with the contents: {'/var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk': 'qcow2', '/var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.014 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.015 183181 DEBUG nova.objects.instance [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1836bbd4-abe5-4658-8980-6c9ef3a08026 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.413 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.524 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.533 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.536 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.625 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.626 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.627 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.628 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.635 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.636 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.701 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.703 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.738 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.739 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.740 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.790 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.792 183181 DEBUG nova.virt.disk.api [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Checking if we can resize image /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.793 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.849 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.851 183181 DEBUG nova.virt.disk.api [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Cannot resize image /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:46:04 compute-0 nova_compute[183177]: 2026-01-26 19:46:04.852 183181 DEBUG nova.objects.instance [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 1836bbd4-abe5-4658-8980-6c9ef3a08026 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.363 183181 DEBUG nova.objects.base [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Object Instance<1836bbd4-abe5-4658-8980-6c9ef3a08026> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.364 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.403 183181 DEBUG oslo_concurrency.processutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.405 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.407 183181 DEBUG nova.virt.libvirt.vif [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T19:44:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-976269844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-976269844',id=11,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:45:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='315dd5c96f24487b9b621d7237bc35ed',ramdisk_id='',reservation_id='r-j54743b4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1074421461',owner_user_name='tempest-TestExecuteBasicStrategy-1074421461-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:45:17Z,user_data=None,user_id='2f579cdd19584229bbf4f240effa28f3',uuid=1836bbd4-abe5-4658-8980-6c9ef3a08026,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.408 183181 DEBUG nova.network.os_vif_util [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.409 183181 DEBUG nova.network.os_vif_util [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.410 183181 DEBUG os_vif [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.411 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.412 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.413 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.414 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.415 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6ef2900e-d372-51f3-bf51-b8843ecf1ca3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.417 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.419 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.424 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.425 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97b66273-52, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.426 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap97b66273-52, col_values=(('qos', UUID('750c88e2-99b8-4f92-ab8d-157fcc78a506')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.426 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap97b66273-52, col_values=(('external_ids', {'iface-id': '97b66273-5273-40e0-83f6-bc5dc63424aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:84:67', 'vm-uuid': '1836bbd4-abe5-4658-8980-6c9ef3a08026'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:05 compute-0 NetworkManager[55489]: <info>  [1769456765.4308] manager: (tap97b66273-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.431 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.438 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.439 183181 INFO os_vif [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52')
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.440 183181 DEBUG nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.441 183181 DEBUG nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_khr_hbg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1836bbd4-abe5-4658-8980-6c9ef3a08026',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 19:46:05 compute-0 nova_compute[183177]: 2026-01-26 19:46:05.442 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:06 compute-0 nova_compute[183177]: 2026-01-26 19:46:06.501 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:07 compute-0 nova_compute[183177]: 2026-01-26 19:46:07.040 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:07.042 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:46:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:07.044 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:46:07 compute-0 nova_compute[183177]: 2026-01-26 19:46:07.560 183181 DEBUG nova.network.neutron [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Port 97b66273-5273-40e0-83f6-bc5dc63424aa updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 19:46:07 compute-0 nova_compute[183177]: 2026-01-26 19:46:07.572 183181 DEBUG nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_khr_hbg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1836bbd4-abe5-4658-8980-6c9ef3a08026',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 19:46:09 compute-0 nova_compute[183177]: 2026-01-26 19:46:09.417 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 nova_compute[183177]: 2026-01-26 19:46:10.466 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 26 19:46:10 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 26 19:46:10 compute-0 podman[208033]: 2026-01-26 19:46:10.847654272 +0000 UTC m=+0.100885354 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:46:10 compute-0 kernel: tap97b66273-52: entered promiscuous mode
Jan 26 19:46:10 compute-0 NetworkManager[55489]: <info>  [1769456770.9298] manager: (tap97b66273-52): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Jan 26 19:46:10 compute-0 ovn_controller[95396]: 2026-01-26T19:46:10Z|00095|binding|INFO|Claiming lport 97b66273-5273-40e0-83f6-bc5dc63424aa for this additional chassis.
Jan 26 19:46:10 compute-0 nova_compute[183177]: 2026-01-26 19:46:10.930 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 ovn_controller[95396]: 2026-01-26T19:46:10Z|00096|binding|INFO|97b66273-5273-40e0-83f6-bc5dc63424aa: Claiming fa:16:3e:9f:84:67 10.100.0.12
Jan 26 19:46:10 compute-0 ovn_controller[95396]: 2026-01-26T19:46:10Z|00097|binding|INFO|Setting lport 97b66273-5273-40e0-83f6-bc5dc63424aa ovn-installed in OVS
Jan 26 19:46:10 compute-0 nova_compute[183177]: 2026-01-26 19:46:10.945 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 nova_compute[183177]: 2026-01-26 19:46:10.947 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:10.949 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:84:67 10.100.0.12'], port_security=['fa:16:3e:9f:84:67 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1836bbd4-abe5-4658-8980-6c9ef3a08026', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0b324a45-9260-4a52-b269-ee106dc2cc6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b55a762-8a6e-494a-8fe1-e2f6fe7cb4f9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=97b66273-5273-40e0-83f6-bc5dc63424aa) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:46:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:10.950 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 97b66273-5273-40e0-83f6-bc5dc63424aa in datapath 0501a2fa-d9aa-43e0-a6c7-2ea169228252 unbound from our chassis
Jan 26 19:46:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:10.952 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0501a2fa-d9aa-43e0-a6c7-2ea169228252
Jan 26 19:46:10 compute-0 nova_compute[183177]: 2026-01-26 19:46:10.953 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:10.976 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2879ff9a-357c-4f43-b506-057eb91f17c7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:10 compute-0 systemd-machined[154465]: New machine qemu-8-instance-0000000b.
Jan 26 19:46:10 compute-0 systemd-udevd[208094]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:46:11 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Jan 26 19:46:11 compute-0 NetworkManager[55489]: <info>  [1769456771.0214] device (tap97b66273-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:46:11 compute-0 NetworkManager[55489]: <info>  [1769456771.0225] device (tap97b66273-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.028 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad47ea5-cf5c-4c7f-b1c8-740dcd813db1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.033 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c6661c01-1f48-4293-b8f7-6d74313f524c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.046 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.084 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e54516a5-1724-456d-a056-566dcc649011]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.109 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[483761c2-d530-420e-9ab2-bfded45c3a45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0501a2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:bd:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415334, 'reachable_time': 31502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208104, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.136 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fd507b63-da5e-465b-b439-f72cfe183d26]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0501a2fa-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415349, 'tstamp': 415349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208106, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0501a2fa-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415355, 'tstamp': 415355}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208106, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.138 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0501a2fa-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:11 compute-0 nova_compute[183177]: 2026-01-26 19:46:11.140 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:11 compute-0 nova_compute[183177]: 2026-01-26 19:46:11.141 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.141 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0501a2fa-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.142 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.142 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0501a2fa-d0, col_values=(('external_ids', {'iface-id': 'a249be39-6b4f-40bc-acd2-fb81baa61f02'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.142 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:46:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:11.144 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1e93ffd1-ed06-4f7e-a631-08bad86a89ed]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0501a2fa-d9aa-43e0-a6c7-2ea169228252\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0501a2fa-d9aa-43e0-a6c7-2ea169228252\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:13 compute-0 ovn_controller[95396]: 2026-01-26T19:46:13Z|00098|binding|INFO|Claiming lport 97b66273-5273-40e0-83f6-bc5dc63424aa for this chassis.
Jan 26 19:46:13 compute-0 ovn_controller[95396]: 2026-01-26T19:46:13Z|00099|binding|INFO|97b66273-5273-40e0-83f6-bc5dc63424aa: Claiming fa:16:3e:9f:84:67 10.100.0.12
Jan 26 19:46:13 compute-0 ovn_controller[95396]: 2026-01-26T19:46:13Z|00100|binding|INFO|Setting lport 97b66273-5273-40e0-83f6-bc5dc63424aa up in Southbound
Jan 26 19:46:14 compute-0 nova_compute[183177]: 2026-01-26 19:46:14.418 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:15 compute-0 nova_compute[183177]: 2026-01-26 19:46:15.483 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:15 compute-0 nova_compute[183177]: 2026-01-26 19:46:15.499 183181 INFO nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Post operation of migration started
Jan 26 19:46:15 compute-0 nova_compute[183177]: 2026-01-26 19:46:15.500 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:16 compute-0 nova_compute[183177]: 2026-01-26 19:46:16.542 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:16 compute-0 nova_compute[183177]: 2026-01-26 19:46:16.543 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:16 compute-0 nova_compute[183177]: 2026-01-26 19:46:16.939 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:46:16 compute-0 nova_compute[183177]: 2026-01-26 19:46:16.939 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:46:16 compute-0 nova_compute[183177]: 2026-01-26 19:46:16.939 183181 DEBUG nova.network.neutron [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:46:17 compute-0 nova_compute[183177]: 2026-01-26 19:46:17.629 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:19 compute-0 nova_compute[183177]: 2026-01-26 19:46:19.420 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:19 compute-0 nova_compute[183177]: 2026-01-26 19:46:19.672 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:20 compute-0 nova_compute[183177]: 2026-01-26 19:46:20.525 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:20 compute-0 nova_compute[183177]: 2026-01-26 19:46:20.622 183181 DEBUG nova.network.neutron [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Updating instance_info_cache with network_info: [{"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:21 compute-0 nova_compute[183177]: 2026-01-26 19:46:21.130 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-1836bbd4-abe5-4658-8980-6c9ef3a08026" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:46:21 compute-0 nova_compute[183177]: 2026-01-26 19:46:21.652 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:21 compute-0 nova_compute[183177]: 2026-01-26 19:46:21.653 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:21 compute-0 nova_compute[183177]: 2026-01-26 19:46:21.654 183181 DEBUG oslo_concurrency.lockutils [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:21 compute-0 nova_compute[183177]: 2026-01-26 19:46:21.662 183181 INFO nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 19:46:21 compute-0 virtqemud[182929]: Domain id=8 name='instance-0000000b' uuid=1836bbd4-abe5-4658-8980-6c9ef3a08026 is tainted: custom-monitor
Jan 26 19:46:22 compute-0 nova_compute[183177]: 2026-01-26 19:46:22.671 183181 INFO nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 19:46:23 compute-0 nova_compute[183177]: 2026-01-26 19:46:23.678 183181 INFO nova.virt.libvirt.driver [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 19:46:23 compute-0 nova_compute[183177]: 2026-01-26 19:46:23.684 183181 DEBUG nova.compute.manager [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:46:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:24.047 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:24.048 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:24.049 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:24 compute-0 nova_compute[183177]: 2026-01-26 19:46:24.195 183181 DEBUG nova.objects.instance [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 19:46:24 compute-0 nova_compute[183177]: 2026-01-26 19:46:24.456 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:25 compute-0 nova_compute[183177]: 2026-01-26 19:46:25.214 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:25 compute-0 nova_compute[183177]: 2026-01-26 19:46:25.515 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:25 compute-0 nova_compute[183177]: 2026-01-26 19:46:25.516 183181 WARNING neutronclient.v2_0.client [None req-fb069449-1615-405f-b76c-e551ae5293a1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:25 compute-0 nova_compute[183177]: 2026-01-26 19:46:25.619 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:29 compute-0 nova_compute[183177]: 2026-01-26 19:46:29.476 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:29 compute-0 podman[192499]: time="2026-01-26T19:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:46:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:46:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Jan 26 19:46:30 compute-0 nova_compute[183177]: 2026-01-26 19:46:30.657 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:31 compute-0 podman[208129]: 2026-01-26 19:46:31.370622113 +0000 UTC m=+0.113666254 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:46:31 compute-0 openstack_network_exporter[195363]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:46:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:46:31 compute-0 openstack_network_exporter[195363]: ERROR   19:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:46:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:46:32 compute-0 nova_compute[183177]: 2026-01-26 19:46:32.997 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "1836bbd4-abe5-4658-8980-6c9ef3a08026" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:32 compute-0 nova_compute[183177]: 2026-01-26 19:46:32.998 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:32.999 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.000 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.000 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.054 183181 INFO nova.compute.manager [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Terminating instance
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.579 183181 DEBUG nova.compute.manager [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:46:33 compute-0 kernel: tap97b66273-52 (unregistering): left promiscuous mode
Jan 26 19:46:33 compute-0 NetworkManager[55489]: <info>  [1769456793.6112] device (tap97b66273-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.616 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:33 compute-0 ovn_controller[95396]: 2026-01-26T19:46:33Z|00101|binding|INFO|Releasing lport 97b66273-5273-40e0-83f6-bc5dc63424aa from this chassis (sb_readonly=0)
Jan 26 19:46:33 compute-0 ovn_controller[95396]: 2026-01-26T19:46:33Z|00102|binding|INFO|Setting lport 97b66273-5273-40e0-83f6-bc5dc63424aa down in Southbound
Jan 26 19:46:33 compute-0 ovn_controller[95396]: 2026-01-26T19:46:33Z|00103|binding|INFO|Removing iface tap97b66273-52 ovn-installed in OVS
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.621 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.628 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:84:67 10.100.0.12'], port_security=['fa:16:3e:9f:84:67 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1836bbd4-abe5-4658-8980-6c9ef3a08026', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '15', 'neutron:security_group_ids': '0b324a45-9260-4a52-b269-ee106dc2cc6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b55a762-8a6e-494a-8fe1-e2f6fe7cb4f9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=97b66273-5273-40e0-83f6-bc5dc63424aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.631 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 97b66273-5273-40e0-83f6-bc5dc63424aa in datapath 0501a2fa-d9aa-43e0-a6c7-2ea169228252 unbound from our chassis
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.633 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0501a2fa-d9aa-43e0-a6c7-2ea169228252
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.638 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:33 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Jan 26 19:46:33 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 2.176s CPU time.
Jan 26 19:46:33 compute-0 systemd-machined[154465]: Machine qemu-8-instance-0000000b terminated.
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.660 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1668a706-4c4b-4431-baf4-de2eded79d60]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.704 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1cc9be-8077-4d9c-adb6-27a0c64b98d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.709 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[541977d3-488a-477f-a7fd-01ffa07deae5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 podman[208159]: 2026-01-26 19:46:33.727316341 +0000 UTC m=+0.069116395 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 26 19:46:33 compute-0 podman[208157]: 2026-01-26 19:46:33.731720581 +0000 UTC m=+0.086954842 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6)
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.747 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[dafb5bb2-2271-4ba9-8557-a61d1fdecdf4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.770 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4efaa2b3-db0e-4d06-8f46-c63869848d96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0501a2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:bd:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415334, 'reachable_time': 31502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208205, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.790 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3a14626c-78ef-4628-bb1f-fafa10d9fbcb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0501a2fa-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415349, 'tstamp': 415349}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208206, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0501a2fa-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415355, 'tstamp': 415355}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208206, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.791 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0501a2fa-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.795 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.799 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0501a2fa-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.799 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.800 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0501a2fa-d0, col_values=(('external_ids', {'iface-id': 'a249be39-6b4f-40bc-acd2-fb81baa61f02'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.800 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:46:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:33.802 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a080b0-078d-469b-98ec-3de432093d87]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0501a2fa-d9aa-43e0-a6c7-2ea169228252\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0501a2fa-d9aa-43e0-a6c7-2ea169228252\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.862 183181 INFO nova.virt.libvirt.driver [-] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Instance destroyed successfully.
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.864 183181 DEBUG nova.objects.instance [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lazy-loading 'resources' on Instance uuid 1836bbd4-abe5-4658-8980-6c9ef3a08026 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.872 183181 DEBUG nova.compute.manager [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Received event network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.873 183181 DEBUG oslo_concurrency.lockutils [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.873 183181 DEBUG oslo_concurrency.lockutils [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.873 183181 DEBUG oslo_concurrency.lockutils [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.874 183181 DEBUG nova.compute.manager [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] No waiting events found dispatching network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:46:33 compute-0 nova_compute[183177]: 2026-01-26 19:46:33.874 183181 DEBUG nova.compute.manager [req-3a51ac46-cbc4-485a-bf54-98f2d51fb52c req-cbf38a50-b228-4426-9ad5-050676109ecf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Received event network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.379 183181 DEBUG nova.virt.libvirt.vif [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2026-01-26T19:44:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-976269844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-976269844',id=11,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:45:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='315dd5c96f24487b9b621d7237bc35ed',ramdisk_id='',reservation_id='r-j54743b4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',clean_attempts='1',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1074421461',owner_user_name='tempest-TestExecuteBasicStrategy-1074421461-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:46:24Z,user_data=None,user_id='2f579cdd19584229bbf4f240effa28f3',uuid=1836bbd4-abe5-4658-8980-6c9ef3a08026,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.379 183181 DEBUG nova.network.os_vif_util [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converting VIF {"id": "97b66273-5273-40e0-83f6-bc5dc63424aa", "address": "fa:16:3e:9f:84:67", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97b66273-52", "ovs_interfaceid": "97b66273-5273-40e0-83f6-bc5dc63424aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.380 183181 DEBUG nova.network.os_vif_util [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.380 183181 DEBUG os_vif [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.381 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.381 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97b66273-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.383 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.384 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.385 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.385 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.385 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=750c88e2-99b8-4f92-ab8d-157fcc78a506) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.386 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.387 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.389 183181 INFO os_vif [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:84:67,bridge_name='br-int',has_traffic_filtering=True,id=97b66273-5273-40e0-83f6-bc5dc63424aa,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97b66273-52')
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.389 183181 INFO nova.virt.libvirt.driver [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Deleting instance files /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026_del
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.390 183181 INFO nova.virt.libvirt.driver [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Deletion of /var/lib/nova/instances/1836bbd4-abe5-4658-8980-6c9ef3a08026_del complete
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.478 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.903 183181 INFO nova.compute.manager [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.904 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.904 183181 DEBUG nova.compute.manager [-] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.905 183181 DEBUG nova.network.neutron [-] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:46:34 compute-0 nova_compute[183177]: 2026-01-26 19:46:34.905 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.556 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.964 183181 DEBUG nova.compute.manager [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Received event network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.965 183181 DEBUG oslo_concurrency.lockutils [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.965 183181 DEBUG oslo_concurrency.lockutils [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.965 183181 DEBUG oslo_concurrency.lockutils [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.966 183181 DEBUG nova.compute.manager [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] No waiting events found dispatching network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:46:35 compute-0 nova_compute[183177]: 2026-01-26 19:46:35.966 183181 DEBUG nova.compute.manager [req-3a1efd37-8fc5-4743-b977-c111cdae577f req-0550966a-ab09-4638-8e26-cefc1522aa44 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Received event network-vif-unplugged-97b66273-5273-40e0-83f6-bc5dc63424aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:46:36 compute-0 nova_compute[183177]: 2026-01-26 19:46:36.622 183181 DEBUG nova.compute.manager [req-1820f488-1700-43f4-b30a-ff139231e923 req-d111404a-fe45-451a-9ed3-761011528d7d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Received event network-vif-deleted-97b66273-5273-40e0-83f6-bc5dc63424aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:36 compute-0 nova_compute[183177]: 2026-01-26 19:46:36.622 183181 INFO nova.compute.manager [req-1820f488-1700-43f4-b30a-ff139231e923 req-d111404a-fe45-451a-9ed3-761011528d7d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Neutron deleted interface 97b66273-5273-40e0-83f6-bc5dc63424aa; detaching it from the instance and deleting it from the info cache
Jan 26 19:46:36 compute-0 nova_compute[183177]: 2026-01-26 19:46:36.622 183181 DEBUG nova.network.neutron [req-1820f488-1700-43f4-b30a-ff139231e923 req-d111404a-fe45-451a-9ed3-761011528d7d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:37 compute-0 nova_compute[183177]: 2026-01-26 19:46:37.056 183181 DEBUG nova.network.neutron [-] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:37 compute-0 nova_compute[183177]: 2026-01-26 19:46:37.130 183181 DEBUG nova.compute.manager [req-1820f488-1700-43f4-b30a-ff139231e923 req-d111404a-fe45-451a-9ed3-761011528d7d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Detach interface failed, port_id=97b66273-5273-40e0-83f6-bc5dc63424aa, reason: Instance 1836bbd4-abe5-4658-8980-6c9ef3a08026 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 19:46:37 compute-0 nova_compute[183177]: 2026-01-26 19:46:37.563 183181 INFO nova.compute.manager [-] [instance: 1836bbd4-abe5-4658-8980-6c9ef3a08026] Took 2.66 seconds to deallocate network for instance.
Jan 26 19:46:38 compute-0 nova_compute[183177]: 2026-01-26 19:46:38.087 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:38 compute-0 nova_compute[183177]: 2026-01-26 19:46:38.087 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:38 compute-0 nova_compute[183177]: 2026-01-26 19:46:38.099 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:38 compute-0 nova_compute[183177]: 2026-01-26 19:46:38.129 183181 INFO nova.scheduler.client.report [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Deleted allocations for instance 1836bbd4-abe5-4658-8980-6c9ef3a08026
Jan 26 19:46:39 compute-0 nova_compute[183177]: 2026-01-26 19:46:39.162 183181 DEBUG oslo_concurrency.lockutils [None req-73235279-1d9e-4c7e-9eb9-54ba0dcba381 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "1836bbd4-abe5-4658-8980-6c9ef3a08026" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.163s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:39 compute-0 nova_compute[183177]: 2026-01-26 19:46:39.433 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:39 compute-0 nova_compute[183177]: 2026-01-26 19:46:39.479 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.257 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.257 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.258 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.258 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.259 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.287 183181 INFO nova.compute.manager [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Terminating instance
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.807 183181 DEBUG nova.compute.manager [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:46:40 compute-0 kernel: tap360af8e5-9c (unregistering): left promiscuous mode
Jan 26 19:46:40 compute-0 NetworkManager[55489]: <info>  [1769456800.8348] device (tap360af8e5-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:46:40 compute-0 ovn_controller[95396]: 2026-01-26T19:46:40Z|00104|binding|INFO|Releasing lport 360af8e5-9c2d-468f-a325-ced8745d90f1 from this chassis (sb_readonly=0)
Jan 26 19:46:40 compute-0 ovn_controller[95396]: 2026-01-26T19:46:40Z|00105|binding|INFO|Setting lport 360af8e5-9c2d-468f-a325-ced8745d90f1 down in Southbound
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.851 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:40 compute-0 ovn_controller[95396]: 2026-01-26T19:46:40Z|00106|binding|INFO|Removing iface tap360af8e5-9c ovn-installed in OVS
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.854 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:40.863 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:9b:79 10.100.0.6'], port_security=['fa:16:3e:e5:9b:79 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '388082f6-0664-4bc2-844f-e9545548138b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '315dd5c96f24487b9b621d7237bc35ed', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0b324a45-9260-4a52-b269-ee106dc2cc6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b55a762-8a6e-494a-8fe1-e2f6fe7cb4f9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=360af8e5-9c2d-468f-a325-ced8745d90f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:46:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:40.864 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 360af8e5-9c2d-468f-a325-ced8745d90f1 in datapath 0501a2fa-d9aa-43e0-a6c7-2ea169228252 unbound from our chassis
Jan 26 19:46:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:40.866 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0501a2fa-d9aa-43e0-a6c7-2ea169228252, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:46:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:40.867 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[adc66b82-7bbd-4fcb-ab7b-4db1444d0f80]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:40.868 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252 namespace which is not needed anymore
Jan 26 19:46:40 compute-0 nova_compute[183177]: 2026-01-26 19:46:40.884 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:40 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 26 19:46:40 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 17.430s CPU time.
Jan 26 19:46:40 compute-0 systemd-machined[154465]: Machine qemu-7-instance-0000000a terminated.
Jan 26 19:46:40 compute-0 podman[208227]: 2026-01-26 19:46:40.979537701 +0000 UTC m=+0.063921233 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:46:41 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [NOTICE]   (207711) : haproxy version is 3.0.5-8e879a5
Jan 26 19:46:41 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [NOTICE]   (207711) : path to executable is /usr/sbin/haproxy
Jan 26 19:46:41 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [WARNING]  (207711) : Exiting Master process...
Jan 26 19:46:41 compute-0 podman[208272]: 2026-01-26 19:46:41.024743968 +0000 UTC m=+0.040002446 container kill c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:46:41 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [ALERT]    (207711) : Current worker (207713) exited with code 143 (Terminated)
Jan 26 19:46:41 compute-0 neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252[207707]: [WARNING]  (207711) : All workers exited. Exiting... (0)
Jan 26 19:46:41 compute-0 systemd[1]: libpod-c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8.scope: Deactivated successfully.
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.033 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 podman[208288]: 2026-01-26 19:46:41.06384988 +0000 UTC m=+0.021299745 container died c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.077 183181 INFO nova.virt.libvirt.driver [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Instance destroyed successfully.
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.078 183181 DEBUG nova.objects.instance [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lazy-loading 'resources' on Instance uuid 388082f6-0664-4bc2-844f-e9545548138b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8-userdata-shm.mount: Deactivated successfully.
Jan 26 19:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-d597fea71c6252501bcffe3ab981a04ff7f02c71a45bd992a16c25dc434448d6-merged.mount: Deactivated successfully.
Jan 26 19:46:41 compute-0 podman[208288]: 2026-01-26 19:46:41.103047913 +0000 UTC m=+0.060497768 container cleanup c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:46:41 compute-0 systemd[1]: libpod-conmon-c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8.scope: Deactivated successfully.
Jan 26 19:46:41 compute-0 podman[208293]: 2026-01-26 19:46:41.120549033 +0000 UTC m=+0.067649534 container remove c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.145 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[48caec11-5c55-4795-8f22-2b6fafeb7b37]: (4, ("Mon Jan 26 07:46:40 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252 (c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8)\nc94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8\nMon Jan 26 07:46:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252 (c94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8)\nc94dea87a80b011a8aa0278c36893d118f879dac932fe252030ffe72d69f5ee8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.147 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba96740-a879-4f48-9e80-42654e6a8133]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.147 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0501a2fa-d9aa-43e0-a6c7-2ea169228252.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.148 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[993e779f-7093-45a3-86d9-65a0b79700ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.149 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0501a2fa-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:41 compute-0 kernel: tap0501a2fa-d0: left promiscuous mode
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.151 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.166 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.169 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e740eb90-37c1-4193-bfce-b0565a14b8d8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.183 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cdbcb5e9-c16f-43bd-a8f1-7caefbea55ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.184 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[df92be3f-95fb-4a1f-aabe-4cae938b6535]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.206 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e920a63-8ba9-456b-92bd-b2194146f58f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415325, 'reachable_time': 43263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208335, 'error': None, 'target': 'ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.213 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0501a2fa-d9aa-43e0-a6c7-2ea169228252 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:46:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:46:41.213 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3bb128-2f65-4d55-bf28-50dab5841ce8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:46:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d0501a2fa\x2dd9aa\x2d43e0\x2da6c7\x2d2ea169228252.mount: Deactivated successfully.
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.584 183181 DEBUG nova.virt.libvirt.vif [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:44:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1795175879',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1795175879',id=10,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:44:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='315dd5c96f24487b9b621d7237bc35ed',ramdisk_id='',reservation_id='r-wvrrsxk9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-1074421461',owner_user_name='tempest-TestExecuteBasicStrategy-1074421461-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:44:55Z,user_data=None,user_id='2f579cdd19584229bbf4f240effa28f3',uuid=388082f6-0664-4bc2-844f-e9545548138b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.585 183181 DEBUG nova.network.os_vif_util [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converting VIF {"id": "360af8e5-9c2d-468f-a325-ced8745d90f1", "address": "fa:16:3e:e5:9b:79", "network": {"id": "0501a2fa-d9aa-43e0-a6c7-2ea169228252", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-847491331-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adee17f8a0194f5eb330b178ca303941", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap360af8e5-9c", "ovs_interfaceid": "360af8e5-9c2d-468f-a325-ced8745d90f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.586 183181 DEBUG nova.network.os_vif_util [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.586 183181 DEBUG os_vif [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.588 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.589 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap360af8e5-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.596 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.597 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d2a2bf38-d1e0-4587-8feb-baed6e347f39) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.599 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.602 183181 INFO os_vif [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:9b:79,bridge_name='br-int',has_traffic_filtering=True,id=360af8e5-9c2d-468f-a325-ced8745d90f1,network=Network(0501a2fa-d9aa-43e0-a6c7-2ea169228252),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap360af8e5-9c')
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.602 183181 INFO nova.virt.libvirt.driver [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Deleting instance files /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b_del
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.604 183181 INFO nova.virt.libvirt.driver [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Deletion of /var/lib/nova/instances/388082f6-0664-4bc2-844f-e9545548138b_del complete
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.704 183181 DEBUG nova.compute.manager [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.704 183181 DEBUG oslo_concurrency.lockutils [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.704 183181 DEBUG oslo_concurrency.lockutils [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.704 183181 DEBUG oslo_concurrency.lockutils [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.705 183181 DEBUG nova.compute.manager [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] No waiting events found dispatching network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:46:41 compute-0 nova_compute[183177]: 2026-01-26 19:46:41.705 183181 DEBUG nova.compute.manager [req-4d4d85fe-83ae-4dd3-a94a-7c6ff08ad019 req-eb0895b4-e082-424a-9d1a-247ac4ca470e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.118 183181 INFO nova.compute.manager [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.119 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.119 183181 DEBUG nova.compute.manager [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.119 183181 DEBUG nova.network.neutron [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.120 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:42 compute-0 nova_compute[183177]: 2026-01-26 19:46:42.537 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.032 183181 DEBUG nova.compute.manager [req-57265f5e-d19d-434b-ba7a-eb83d8883d3f req-ed0c3a7a-cb6f-46f0-9d66-abd99905fe1d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-deleted-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.033 183181 INFO nova.compute.manager [req-57265f5e-d19d-434b-ba7a-eb83d8883d3f req-ed0c3a7a-cb6f-46f0-9d66-abd99905fe1d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Neutron deleted interface 360af8e5-9c2d-468f-a325-ced8745d90f1; detaching it from the instance and deleting it from the info cache
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.033 183181 DEBUG nova.network.neutron [req-57265f5e-d19d-434b-ba7a-eb83d8883d3f req-ed0c3a7a-cb6f-46f0-9d66-abd99905fe1d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.463 183181 DEBUG nova.network.neutron [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.540 183181 DEBUG nova.compute.manager [req-57265f5e-d19d-434b-ba7a-eb83d8883d3f req-ed0c3a7a-cb6f-46f0-9d66-abd99905fe1d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Detach interface failed, port_id=360af8e5-9c2d-468f-a325-ced8745d90f1, reason: Instance 388082f6-0664-4bc2-844f-e9545548138b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.784 183181 DEBUG nova.compute.manager [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.785 183181 DEBUG oslo_concurrency.lockutils [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "388082f6-0664-4bc2-844f-e9545548138b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.785 183181 DEBUG oslo_concurrency.lockutils [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.785 183181 DEBUG oslo_concurrency.lockutils [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.785 183181 DEBUG nova.compute.manager [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] No waiting events found dispatching network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.785 183181 DEBUG nova.compute.manager [req-8c1aa1df-7535-42d2-9a26-580970783777 req-942298fb-9f4c-4a54-99df-21401a7f2c98 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Received event network-vif-unplugged-360af8e5-9c2d-468f-a325-ced8745d90f1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:46:43 compute-0 nova_compute[183177]: 2026-01-26 19:46:43.970 183181 INFO nova.compute.manager [-] [instance: 388082f6-0664-4bc2-844f-e9545548138b] Took 1.85 seconds to deallocate network for instance.
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.486 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.486 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.517 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.573 183181 DEBUG nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.599 183181 DEBUG nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.600 183181 DEBUG nova.compute.provider_tree [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.623 183181 DEBUG nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.655 183181 DEBUG nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:46:44 compute-0 nova_compute[183177]: 2026-01-26 19:46:44.704 183181 DEBUG nova.compute.provider_tree [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:46:45 compute-0 nova_compute[183177]: 2026-01-26 19:46:45.213 183181 DEBUG nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:46:45 compute-0 nova_compute[183177]: 2026-01-26 19:46:45.724 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.238s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:45 compute-0 nova_compute[183177]: 2026-01-26 19:46:45.755 183181 INFO nova.scheduler.client.report [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Deleted allocations for instance 388082f6-0664-4bc2-844f-e9545548138b
Jan 26 19:46:46 compute-0 nova_compute[183177]: 2026-01-26 19:46:46.598 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:46 compute-0 nova_compute[183177]: 2026-01-26 19:46:46.793 183181 DEBUG oslo_concurrency.lockutils [None req-f916245f-3f95-411a-af6a-3436e00cfa4f 2f579cdd19584229bbf4f240effa28f3 315dd5c96f24487b9b621d7237bc35ed - - default default] Lock "388082f6-0664-4bc2-844f-e9545548138b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.535s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:49 compute-0 nova_compute[183177]: 2026-01-26 19:46:49.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:49 compute-0 nova_compute[183177]: 2026-01-26 19:46:49.517 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:51 compute-0 nova_compute[183177]: 2026-01-26 19:46:51.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:51 compute-0 nova_compute[183177]: 2026-01-26 19:46:51.601 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.667 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.905 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.906 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.935 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.936 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5794MB free_disk=73.09881591796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.937 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:46:52 compute-0 nova_compute[183177]: 2026-01-26 19:46:52.937 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:46:53 compute-0 nova_compute[183177]: 2026-01-26 19:46:53.989 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:46:53 compute-0 nova_compute[183177]: 2026-01-26 19:46:53.990 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:46:52 up  1:11,  0 user,  load average: 0.17, 0.25, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:46:54 compute-0 nova_compute[183177]: 2026-01-26 19:46:54.018 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:46:54 compute-0 nova_compute[183177]: 2026-01-26 19:46:54.523 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:54 compute-0 nova_compute[183177]: 2026-01-26 19:46:54.527 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:46:55 compute-0 nova_compute[183177]: 2026-01-26 19:46:55.043 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:46:55 compute-0 nova_compute[183177]: 2026-01-26 19:46:55.044 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:46:55 compute-0 nova_compute[183177]: 2026-01-26 19:46:55.173 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:56 compute-0 nova_compute[183177]: 2026-01-26 19:46:56.603 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:57 compute-0 nova_compute[183177]: 2026-01-26 19:46:57.040 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:57 compute-0 nova_compute[183177]: 2026-01-26 19:46:57.040 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:57 compute-0 nova_compute[183177]: 2026-01-26 19:46:57.040 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:57 compute-0 nova_compute[183177]: 2026-01-26 19:46:57.041 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:46:57 compute-0 nova_compute[183177]: 2026-01-26 19:46:57.041 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:46:59 compute-0 nova_compute[183177]: 2026-01-26 19:46:59.553 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:46:59 compute-0 podman[192499]: time="2026-01-26T19:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:46:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:46:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 26 19:47:00 compute-0 nova_compute[183177]: 2026-01-26 19:47:00.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:01 compute-0 openstack_network_exporter[195363]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:47:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:47:01 compute-0 openstack_network_exporter[195363]: ERROR   19:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:47:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:47:01 compute-0 nova_compute[183177]: 2026-01-26 19:47:01.605 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:02 compute-0 podman[208338]: 2026-01-26 19:47:02.424996745 +0000 UTC m=+0.164212470 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 19:47:04 compute-0 podman[208365]: 2026-01-26 19:47:04.302566639 +0000 UTC m=+0.053002713 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350)
Jan 26 19:47:04 compute-0 podman[208366]: 2026-01-26 19:47:04.320245132 +0000 UTC m=+0.057388912 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 19:47:04 compute-0 nova_compute[183177]: 2026-01-26 19:47:04.556 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:06 compute-0 nova_compute[183177]: 2026-01-26 19:47:06.607 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:09 compute-0 nova_compute[183177]: 2026-01-26 19:47:09.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:09 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:09.605 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:37:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-553279b9-0d31-4b18-a3ac-a6b6861ed8e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553279b9-0d31-4b18-a3ac-a6b6861ed8e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89f1160560c94c4187f6209fb2a3b2be', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8715d70-6668-4530-aa7b-2fc8a16e0e98, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ac8b5e73-9748-4ef4-b8df-5625dd1382b1) old=Port_Binding(mac=['fa:16:3e:6d:37:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-553279b9-0d31-4b18-a3ac-a6b6861ed8e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553279b9-0d31-4b18-a3ac-a6b6861ed8e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89f1160560c94c4187f6209fb2a3b2be', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:47:09 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:09.607 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ac8b5e73-9748-4ef4-b8df-5625dd1382b1 in datapath 553279b9-0d31-4b18-a3ac-a6b6861ed8e4 updated
Jan 26 19:47:09 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:09.608 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 553279b9-0d31-4b18-a3ac-a6b6861ed8e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:47:09 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:09.609 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bad0c988-8b4c-4376-9d16-92dd971bdc86]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:47:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:10.778 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:47:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:10.779 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:47:10 compute-0 nova_compute[183177]: 2026-01-26 19:47:10.778 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:11 compute-0 podman[208408]: 2026-01-26 19:47:11.29778815 +0000 UTC m=+0.053820697 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:47:11 compute-0 nova_compute[183177]: 2026-01-26 19:47:11.609 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:14 compute-0 nova_compute[183177]: 2026-01-26 19:47:14.559 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:16 compute-0 nova_compute[183177]: 2026-01-26 19:47:16.611 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:16 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:16.780 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:47:19 compute-0 nova_compute[183177]: 2026-01-26 19:47:19.560 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:19.955 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:0f:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-5b8ebadf-740c-4904-8527-561a30f2b3d2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b8ebadf-740c-4904-8527-561a30f2b3d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e13bdd90f7e4f91a95bbd32d3d83249', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=da261b04-59b9-4a84-b39d-5803948077c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e152de4c-0eaa-4c99-8abd-68bf56d32576) old=Port_Binding(mac=['fa:16:3e:f7:0f:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-5b8ebadf-740c-4904-8527-561a30f2b3d2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b8ebadf-740c-4904-8527-561a30f2b3d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e13bdd90f7e4f91a95bbd32d3d83249', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:47:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:19.956 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e152de4c-0eaa-4c99-8abd-68bf56d32576 in datapath 5b8ebadf-740c-4904-8527-561a30f2b3d2 updated
Jan 26 19:47:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:19.957 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b8ebadf-740c-4904-8527-561a30f2b3d2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:47:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:19.958 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[68c99476-a08e-4c75-9a17-c45262beb446]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:47:21 compute-0 nova_compute[183177]: 2026-01-26 19:47:21.613 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:24.049 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:47:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:24.050 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:47:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:24.050 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:47:24 compute-0 nova_compute[183177]: 2026-01-26 19:47:24.594 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:26 compute-0 nova_compute[183177]: 2026-01-26 19:47:26.615 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:29 compute-0 nova_compute[183177]: 2026-01-26 19:47:29.652 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:29 compute-0 podman[192499]: time="2026-01-26T19:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:47:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:47:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 19:47:29 compute-0 ovn_controller[95396]: 2026-01-26T19:47:29Z|00107|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 19:47:31 compute-0 openstack_network_exporter[195363]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:47:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:47:31 compute-0 openstack_network_exporter[195363]: ERROR   19:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:47:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:47:31 compute-0 nova_compute[183177]: 2026-01-26 19:47:31.617 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:33 compute-0 podman[208434]: 2026-01-26 19:47:33.345431819 +0000 UTC m=+0.090947041 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:47:34 compute-0 nova_compute[183177]: 2026-01-26 19:47:34.655 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:35 compute-0 podman[208461]: 2026-01-26 19:47:35.331432203 +0000 UTC m=+0.077144403 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter)
Jan 26 19:47:35 compute-0 podman[208462]: 2026-01-26 19:47:35.361437815 +0000 UTC m=+0.100510913 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 19:47:36 compute-0 nova_compute[183177]: 2026-01-26 19:47:36.620 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:39 compute-0 nova_compute[183177]: 2026-01-26 19:47:39.657 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:41 compute-0 nova_compute[183177]: 2026-01-26 19:47:41.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:41 compute-0 nova_compute[183177]: 2026-01-26 19:47:41.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:47:41 compute-0 nova_compute[183177]: 2026-01-26 19:47:41.621 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:41 compute-0 nova_compute[183177]: 2026-01-26 19:47:41.723 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:47:42 compute-0 podman[208500]: 2026-01-26 19:47:42.347494644 +0000 UTC m=+0.082177332 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:47:44 compute-0 nova_compute[183177]: 2026-01-26 19:47:44.685 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:46 compute-0 nova_compute[183177]: 2026-01-26 19:47:46.623 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:49 compute-0 nova_compute[183177]: 2026-01-26 19:47:49.688 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:50 compute-0 nova_compute[183177]: 2026-01-26 19:47:50.724 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:51 compute-0 nova_compute[183177]: 2026-01-26 19:47:51.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:51 compute-0 sshd-session[208524]: Invalid user hduser from 193.32.162.151 port 38058
Jan 26 19:47:51 compute-0 nova_compute[183177]: 2026-01-26 19:47:51.625 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:51 compute-0 sshd-session[208524]: Connection closed by invalid user hduser 193.32.162.151 port 38058 [preauth]
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.679 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.679 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.680 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.680 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:47:54 compute-0 nova_compute[183177]: 2026-01-26 19:47:54.690 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.012 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.014 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.043 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.045 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5801MB free_disk=73.09879684448242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.045 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:47:55 compute-0 nova_compute[183177]: 2026-01-26 19:47:55.046 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:47:56 compute-0 nova_compute[183177]: 2026-01-26 19:47:56.599 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:47:56 compute-0 nova_compute[183177]: 2026-01-26 19:47:56.600 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:47:55 up  1:12,  0 user,  load average: 0.06, 0.20, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:47:56 compute-0 nova_compute[183177]: 2026-01-26 19:47:56.621 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:47:56 compute-0 nova_compute[183177]: 2026-01-26 19:47:56.627 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:57 compute-0 nova_compute[183177]: 2026-01-26 19:47:57.268 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:47:57 compute-0 nova_compute[183177]: 2026-01-26 19:47:57.916 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:47:57 compute-0 nova_compute[183177]: 2026-01-26 19:47:57.916 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.870s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:47:57 compute-0 nova_compute[183177]: 2026-01-26 19:47:57.917 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:47:57 compute-0 nova_compute[183177]: 2026-01-26 19:47:57.917 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:47:59 compute-0 nova_compute[183177]: 2026-01-26 19:47:59.692 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:47:59 compute-0 podman[192499]: time="2026-01-26T19:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:47:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:47:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:59.748 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f3:1e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '462f3c2b8ab64156915d1fc496fd2e53', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd82498-4e86-414e-9a6d-c217ab314723, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b6204483-01c7-496e-85f4-be5264700777) old=Port_Binding(mac=['fa:16:3e:36:f3:1e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '462f3c2b8ab64156915d1fc496fd2e53', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:47:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:59.750 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b6204483-01c7-496e-85f4-be5264700777 in datapath 147aa3ea-66ec-4250-9408-de2c9a19f4fa updated
Jan 26 19:47:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:59.752 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 147aa3ea-66ec-4250-9408-de2c9a19f4fa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:47:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:47:59.752 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e76c0b-2305-4e7a-85d8-6bfe4a8e993d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:47:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 19:48:00 compute-0 nova_compute[183177]: 2026-01-26 19:48:00.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:00 compute-0 nova_compute[183177]: 2026-01-26 19:48:00.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:00 compute-0 nova_compute[183177]: 2026-01-26 19:48:00.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:00 compute-0 nova_compute[183177]: 2026-01-26 19:48:00.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:01 compute-0 openstack_network_exporter[195363]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:48:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:48:01 compute-0 openstack_network_exporter[195363]: ERROR   19:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:48:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:48:01 compute-0 nova_compute[183177]: 2026-01-26 19:48:01.629 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:01 compute-0 nova_compute[183177]: 2026-01-26 19:48:01.933 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:04 compute-0 podman[208527]: 2026-01-26 19:48:04.466804557 +0000 UTC m=+0.199918017 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:48:04 compute-0 nova_compute[183177]: 2026-01-26 19:48:04.694 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:06 compute-0 podman[208552]: 2026-01-26 19:48:06.330181172 +0000 UTC m=+0.074592495 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 26 19:48:06 compute-0 podman[208551]: 2026-01-26 19:48:06.329951465 +0000 UTC m=+0.074246165 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 26 19:48:06 compute-0 nova_compute[183177]: 2026-01-26 19:48:06.631 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:08.600 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:30:16 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc957908-22c5-40fe-82cc-4fb45a2610d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc957908-22c5-40fe-82cc-4fb45a2610d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e4dd38c-42d3-4f80-bbf3-2e45d66b2023, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fd6afee7-8462-46b1-90f5-9c77ea2e6865) old=Port_Binding(mac=['fa:16:3e:2a:30:16'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cc957908-22c5-40fe-82cc-4fb45a2610d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc957908-22c5-40fe-82cc-4fb45a2610d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:48:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:08.602 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fd6afee7-8462-46b1-90f5-9c77ea2e6865 in datapath cc957908-22c5-40fe-82cc-4fb45a2610d5 updated
Jan 26 19:48:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:08.603 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc957908-22c5-40fe-82cc-4fb45a2610d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:48:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:08.604 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[61111e34-83c9-4791-ac1a-55e1acf12ccc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:09 compute-0 nova_compute[183177]: 2026-01-26 19:48:09.696 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:11 compute-0 nova_compute[183177]: 2026-01-26 19:48:11.633 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:12 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 19:48:12 compute-0 podman[208590]: 2026-01-26 19:48:12.813466201 +0000 UTC m=+0.064790536 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:48:14 compute-0 nova_compute[183177]: 2026-01-26 19:48:14.697 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:16 compute-0 nova_compute[183177]: 2026-01-26 19:48:16.635 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:17 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:17.125 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:48:17 compute-0 nova_compute[183177]: 2026-01-26 19:48:17.126 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:17 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:17.127 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:48:19 compute-0 nova_compute[183177]: 2026-01-26 19:48:19.698 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:21 compute-0 nova_compute[183177]: 2026-01-26 19:48:21.672 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:22.130 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:24.051 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:24.051 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:24.051 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:24 compute-0 nova_compute[183177]: 2026-01-26 19:48:24.701 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:25 compute-0 nova_compute[183177]: 2026-01-26 19:48:25.745 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:25 compute-0 nova_compute[183177]: 2026-01-26 19:48:25.745 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.252 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.718 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.838 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.839 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.853 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:48:26 compute-0 nova_compute[183177]: 2026-01-26 19:48:26.854 183181 INFO nova.compute.claims [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:48:28 compute-0 nova_compute[183177]: 2026-01-26 19:48:28.915 183181 DEBUG nova.compute.provider_tree [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:48:29 compute-0 nova_compute[183177]: 2026-01-26 19:48:29.423 183181 DEBUG nova.scheduler.client.report [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:48:29 compute-0 nova_compute[183177]: 2026-01-26 19:48:29.703 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:29 compute-0 podman[192499]: time="2026-01-26T19:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:48:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:48:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 19:48:29 compute-0 nova_compute[183177]: 2026-01-26 19:48:29.993 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.154s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:29 compute-0 nova_compute[183177]: 2026-01-26 19:48:29.994 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:48:30 compute-0 nova_compute[183177]: 2026-01-26 19:48:30.672 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:48:30 compute-0 nova_compute[183177]: 2026-01-26 19:48:30.673 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:48:30 compute-0 nova_compute[183177]: 2026-01-26 19:48:30.674 183181 WARNING neutronclient.v2_0.client [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:48:30 compute-0 nova_compute[183177]: 2026-01-26 19:48:30.674 183181 WARNING neutronclient.v2_0.client [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:48:31 compute-0 nova_compute[183177]: 2026-01-26 19:48:31.193 183181 INFO nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:48:31 compute-0 openstack_network_exporter[195363]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:48:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:48:31 compute-0 openstack_network_exporter[195363]: ERROR   19:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:48:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:48:31 compute-0 nova_compute[183177]: 2026-01-26 19:48:31.720 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:31 compute-0 nova_compute[183177]: 2026-01-26 19:48:31.788 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.159 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Successfully created port: e510673a-2b92-49f7-83b8-aba8a3f87ee7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.690 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Successfully updated port: e510673a-2b92-49f7-83b8-aba8a3f87ee7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.765 183181 DEBUG nova.compute.manager [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-changed-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.766 183181 DEBUG nova.compute.manager [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Refreshing instance network info cache due to event network-changed-e510673a-2b92-49f7-83b8-aba8a3f87ee7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.766 183181 DEBUG oslo_concurrency.lockutils [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.767 183181 DEBUG oslo_concurrency.lockutils [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.767 183181 DEBUG nova.network.neutron [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Refreshing network info cache for port e510673a-2b92-49f7-83b8-aba8a3f87ee7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.808 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.810 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.810 183181 INFO nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Creating image(s)
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.811 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.812 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.813 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.815 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.823 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.825 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.914 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.915 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.916 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.917 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.923 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.924 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.986 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:32 compute-0 nova_compute[183177]: 2026-01-26 19:48:32.987 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.023 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.025 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.026 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.097 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.099 183181 DEBUG nova.virt.disk.api [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Checking if we can resize image /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.099 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.161 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.161 183181 DEBUG nova.virt.disk.api [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Cannot resize image /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.162 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.162 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Ensure instance console log exists: /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.162 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.163 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.163 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.197 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.273 183181 WARNING neutronclient.v2_0.client [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.610 183181 DEBUG nova.network.neutron [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:48:33 compute-0 nova_compute[183177]: 2026-01-26 19:48:33.854 183181 DEBUG nova.network.neutron [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:48:34 compute-0 nova_compute[183177]: 2026-01-26 19:48:34.361 183181 DEBUG oslo_concurrency.lockutils [req-101d4977-9575-40ef-8e3f-e9762fc8b553 req-2639c107-c1cb-43f9-9b3a-4977b3c5190e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:48:34 compute-0 nova_compute[183177]: 2026-01-26 19:48:34.362 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquired lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:48:34 compute-0 nova_compute[183177]: 2026-01-26 19:48:34.363 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:48:34 compute-0 nova_compute[183177]: 2026-01-26 19:48:34.706 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:35 compute-0 podman[208631]: 2026-01-26 19:48:35.361939326 +0000 UTC m=+0.110553159 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 19:48:35 compute-0 nova_compute[183177]: 2026-01-26 19:48:35.617 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:48:36 compute-0 nova_compute[183177]: 2026-01-26 19:48:36.045 183181 WARNING neutronclient.v2_0.client [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:48:36 compute-0 nova_compute[183177]: 2026-01-26 19:48:36.718 183181 DEBUG nova.network.neutron [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updating instance_info_cache with network_info: [{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:48:36 compute-0 nova_compute[183177]: 2026-01-26 19:48:36.733 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.236 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Releasing lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.237 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance network_info: |[{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.241 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Start _get_guest_xml network_info=[{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.248 183181 WARNING nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.250 183181 DEBUG nova.virt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1019962861', uuid='6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d'), owner=OwnerMeta(userid='41beaf3eb21246ea94c2701984cf4279', username='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin', projectid='f91153215558476397ac0fa698028694', projectname='tempest-TestExecuteHostMaintenanceStrategy-651747916'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": 
"e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769456917.2500198) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.258 183181 DEBUG nova.virt.libvirt.host [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.259 183181 DEBUG nova.virt.libvirt.host [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.263 183181 DEBUG nova.virt.libvirt.host [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.264 183181 DEBUG nova.virt.libvirt.host [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.267 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.268 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.268 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.269 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.269 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.269 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.270 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.270 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.270 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.271 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.271 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.272 183181 DEBUG nova.virt.hardware [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.279 183181 DEBUG nova.virt.libvirt.vif [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1019962861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1019962861',id=12,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-fji1gb34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-Te
stExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:48:31Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.280 183181 DEBUG nova.network.os_vif_util [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converting VIF {"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.281 183181 DEBUG nova.network.os_vif_util [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.282 183181 DEBUG nova.objects.instance [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:48:37 compute-0 podman[208657]: 2026-01-26 19:48:37.325555208 +0000 UTC m=+0.072385444 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 26 19:48:37 compute-0 podman[208658]: 2026-01-26 19:48:37.375542417 +0000 UTC m=+0.111461724 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.814 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <uuid>6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</uuid>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <name>instance-0000000c</name>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1019962861</nova:name>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:48:37</nova:creationTime>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:48:37 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:48:37 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         <nova:port uuid="e510673a-2b92-49f7-83b8-aba8a3f87ee7">
Jan 26 19:48:37 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <system>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="serial">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="uuid">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </system>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <os>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </os>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <features>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </features>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:15:e5:47"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <target dev="tape510673a-2b"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <video>
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </video>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:48:37 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:48:37 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:48:37 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:48:37 compute-0 nova_compute[183177]: </domain>
Jan 26 19:48:37 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.817 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Preparing to wait for external event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.818 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.819 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.819 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.821 183181 DEBUG nova.virt.libvirt.vif [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1019962861',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1019962861',id=12,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-fji1gb34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='
tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:48:31Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.822 183181 DEBUG nova.network.os_vif_util [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converting VIF {"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.825 183181 DEBUG nova.network.os_vif_util [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.826 183181 DEBUG os_vif [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.828 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.829 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.830 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.832 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.832 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7af65552-86b3-570d-906a-2871d979597f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.881 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.883 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.891 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.892 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape510673a-2b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.893 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape510673a-2b, col_values=(('qos', UUID('0ac5e89a-a5d3-453a-a728-52b637227990')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.893 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape510673a-2b, col_values=(('external_ids', {'iface-id': 'e510673a-2b92-49f7-83b8-aba8a3f87ee7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:e5:47', 'vm-uuid': '6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.896 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 NetworkManager[55489]: <info>  [1769456917.8978] manager: (tape510673a-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.899 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.907 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:37 compute-0 nova_compute[183177]: 2026-01-26 19:48:37.908 183181 INFO os_vif [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b')
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.456 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.457 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.457 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No VIF found with MAC fa:16:3e:15:e5:47, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.458 183181 INFO nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Using config drive
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.709 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:39 compute-0 nova_compute[183177]: 2026-01-26 19:48:39.971 183181 WARNING neutronclient.v2_0.client [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:48:40 compute-0 nova_compute[183177]: 2026-01-26 19:48:40.715 183181 INFO nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Creating config drive at /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config
Jan 26 19:48:40 compute-0 nova_compute[183177]: 2026-01-26 19:48:40.727 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpv2z3fe4w execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:40 compute-0 nova_compute[183177]: 2026-01-26 19:48:40.873 183181 DEBUG oslo_concurrency.processutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpv2z3fe4w" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:40 compute-0 kernel: tape510673a-2b: entered promiscuous mode
Jan 26 19:48:40 compute-0 NetworkManager[55489]: <info>  [1769456920.9911] manager: (tape510673a-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 26 19:48:40 compute-0 ovn_controller[95396]: 2026-01-26T19:48:40Z|00108|binding|INFO|Claiming lport e510673a-2b92-49f7-83b8-aba8a3f87ee7 for this chassis.
Jan 26 19:48:40 compute-0 ovn_controller[95396]: 2026-01-26T19:48:40Z|00109|binding|INFO|e510673a-2b92-49f7-83b8-aba8a3f87ee7: Claiming fa:16:3e:15:e5:47 10.100.0.6
Jan 26 19:48:40 compute-0 nova_compute[183177]: 2026-01-26 19:48:40.994 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:40.999 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.006 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.023 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:e5:47 10.100.0.6'], port_security=['fa:16:3e:15:e5:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcc3a434-5bc5-4cb2-8878-ad4556ff41ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd82498-4e86-414e-9a6d-c217ab314723, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=e510673a-2b92-49f7-83b8-aba8a3f87ee7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.024 104672 INFO neutron.agent.ovn.metadata.agent [-] Port e510673a-2b92-49f7-83b8-aba8a3f87ee7 in datapath 147aa3ea-66ec-4250-9408-de2c9a19f4fa bound to our chassis
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.026 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:48:41 compute-0 systemd-udevd[208717]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.048 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[813d66ed-0dc5-4cde-9cba-2e7cdd275265]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.049 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap147aa3ea-61 in ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.060 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap147aa3ea-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.061 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7a651d-136b-4780-b8e5-71b2083c3350]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.063 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0371a2-a789-493b-98bc-ed26bab98bdd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 systemd-machined[154465]: New machine qemu-9-instance-0000000c.
Jan 26 19:48:41 compute-0 NetworkManager[55489]: <info>  [1769456921.0790] device (tape510673a-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:48:41 compute-0 NetworkManager[55489]: <info>  [1769456921.0815] device (tape510673a-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.081 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e914b8-fec0-40f5-a95b-d3cdfb7f2691]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.099 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[95ed7767-89e0-4c64-af21-46ec22b34164]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.100 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_controller[95396]: 2026-01-26T19:48:41Z|00110|binding|INFO|Setting lport e510673a-2b92-49f7-83b8-aba8a3f87ee7 ovn-installed in OVS
Jan 26 19:48:41 compute-0 ovn_controller[95396]: 2026-01-26T19:48:41Z|00111|binding|INFO|Setting lport e510673a-2b92-49f7-83b8-aba8a3f87ee7 up in Southbound
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.102 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.152 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ea7f08-21b9-46a1-a667-0f67f0988978]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.157 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ba9141-ba89-4d3f-b7bf-a8d3a54868e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 systemd-udevd[208721]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:48:41 compute-0 NetworkManager[55489]: <info>  [1769456921.1603] manager: (tap147aa3ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/48)
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.195 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[b72e7f2a-ee18-4a5f-95e7-9fea60b68d51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.200 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7dcdb1-0472-4f58-86fe-650ef62369f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 NetworkManager[55489]: <info>  [1769456921.2330] device (tap147aa3ea-60): carrier: link connected
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.241 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5db1f1-bbcb-404c-b9dd-a3894c469457]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.265 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f06c281c-595b-40f4-a840-275b0d877ab4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap147aa3ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:f3:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438011, 'reachable_time': 38148, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208750, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.285 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b78371-3ec1-45ca-984a-c6c124b523ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:f31e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438011, 'tstamp': 438011}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208751, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.289 183181 DEBUG nova.compute.manager [req-c0f4196f-51c3-4f2f-a501-eb573c7743b6 req-e89235d3-5e46-469b-a0bc-37db2b3c7d0c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.289 183181 DEBUG oslo_concurrency.lockutils [req-c0f4196f-51c3-4f2f-a501-eb573c7743b6 req-e89235d3-5e46-469b-a0bc-37db2b3c7d0c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.289 183181 DEBUG oslo_concurrency.lockutils [req-c0f4196f-51c3-4f2f-a501-eb573c7743b6 req-e89235d3-5e46-469b-a0bc-37db2b3c7d0c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.290 183181 DEBUG oslo_concurrency.lockutils [req-c0f4196f-51c3-4f2f-a501-eb573c7743b6 req-e89235d3-5e46-469b-a0bc-37db2b3c7d0c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.290 183181 DEBUG nova.compute.manager [req-c0f4196f-51c3-4f2f-a501-eb573c7743b6 req-e89235d3-5e46-469b-a0bc-37db2b3c7d0c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Processing event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.312 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ec812afa-a88c-4cdf-b70e-f9cf38e028f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap147aa3ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:f3:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438011, 'reachable_time': 38148, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208752, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.350 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b2426bc3-7f98-4fb1-85e5-9642d0faa4bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.437 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.440 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bfddcd-6681-4fee-a489-a0a5946755d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.441 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.442 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap147aa3ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.443 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.443 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap147aa3ea-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.445 183181 INFO nova.virt.libvirt.driver [-] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance spawned successfully.
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.445 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:48:41 compute-0 NetworkManager[55489]: <info>  [1769456921.4469] manager: (tap147aa3ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 26 19:48:41 compute-0 kernel: tap147aa3ea-60: entered promiscuous mode
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.449 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.450 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap147aa3ea-60, col_values=(('external_ids', {'iface-id': 'b6204483-01c7-496e-85f4-be5264700777'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.453 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_controller[95396]: 2026-01-26T19:48:41Z|00112|binding|INFO|Releasing lport b6204483-01c7-496e-85f4-be5264700777 from this chassis (sb_readonly=0)
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.479 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.481 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[79bac6de-b7ab-4ef4-87cf-80bd3c252b71]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.482 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.483 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.483 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 147aa3ea-66ec-4250-9408-de2c9a19f4fa disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.483 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.484 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf3af5b-dd6d-497f-9a5b-e8cfbe8e026a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.485 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.485 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0c05b194-881b-46ae-8036-38edf71ed894]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.486 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:48:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:48:41.487 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'env', 'PROCESS_TAG=haproxy-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/147aa3ea-66ec-4250-9408-de2c9a19f4fa.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:48:41 compute-0 podman[208791]: 2026-01-26 19:48:41.928478056 +0000 UTC m=+0.076721193 container create db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest)
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.969 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.970 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.970 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.971 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.971 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 nova_compute[183177]: 2026-01-26 19:48:41.972 183181 DEBUG nova.virt.libvirt.driver [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:48:41 compute-0 podman[208791]: 2026-01-26 19:48:41.886480785 +0000 UTC m=+0.034723982 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:48:41 compute-0 systemd[1]: Started libpod-conmon-db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a.scope.
Jan 26 19:48:42 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:48:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02b823549fbb3030cad8d1f38c786e4dfe427f080cc81246ff9ca55108e2d64a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:48:42 compute-0 podman[208791]: 2026-01-26 19:48:42.057369316 +0000 UTC m=+0.205612473 container init db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:48:42 compute-0 podman[208791]: 2026-01-26 19:48:42.07064423 +0000 UTC m=+0.218887357 container start db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 19:48:42 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [NOTICE]   (208810) : New worker (208812) forked
Jan 26 19:48:42 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [NOTICE]   (208810) : Loading success.
Jan 26 19:48:42 compute-0 nova_compute[183177]: 2026-01-26 19:48:42.483 183181 INFO nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Took 9.67 seconds to spawn the instance on the hypervisor.
Jan 26 19:48:42 compute-0 nova_compute[183177]: 2026-01-26 19:48:42.484 183181 DEBUG nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:48:42 compute-0 nova_compute[183177]: 2026-01-26 19:48:42.897 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.019 183181 INFO nova.compute.manager [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Took 16.26 seconds to build instance.
Jan 26 19:48:43 compute-0 podman[208821]: 2026-01-26 19:48:43.349077465 +0000 UTC m=+0.084345401 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.356 183181 DEBUG nova.compute.manager [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.356 183181 DEBUG oslo_concurrency.lockutils [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.357 183181 DEBUG oslo_concurrency.lockutils [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.357 183181 DEBUG oslo_concurrency.lockutils [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.357 183181 DEBUG nova.compute.manager [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.358 183181 WARNING nova.compute.manager [req-39e31e79-6ced-4ab4-8f53-22f4a0d6b5f6 req-101295de-08cb-42cb-aa8e-2ce55055da85 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received unexpected event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with vm_state active and task_state None.
Jan 26 19:48:43 compute-0 nova_compute[183177]: 2026-01-26 19:48:43.525 183181 DEBUG oslo_concurrency.lockutils [None req-f9fc16a9-af3b-4497-b1b3-ceff0b179797 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.779s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:44 compute-0 nova_compute[183177]: 2026-01-26 19:48:44.755 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:47 compute-0 nova_compute[183177]: 2026-01-26 19:48:47.900 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:49 compute-0 nova_compute[183177]: 2026-01-26 19:48:49.757 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:50 compute-0 nova_compute[183177]: 2026-01-26 19:48:50.741 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:51 compute-0 nova_compute[183177]: 2026-01-26 19:48:51.252 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Triggering sync for uuid 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Jan 26 19:48:51 compute-0 nova_compute[183177]: 2026-01-26 19:48:51.254 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:51 compute-0 nova_compute[183177]: 2026-01-26 19:48:51.255 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:51 compute-0 nova_compute[183177]: 2026-01-26 19:48:51.666 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:51 compute-0 nova_compute[183177]: 2026-01-26 19:48:51.766 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.511s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:52 compute-0 nova_compute[183177]: 2026-01-26 19:48:52.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:52 compute-0 nova_compute[183177]: 2026-01-26 19:48:52.904 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:54 compute-0 nova_compute[183177]: 2026-01-26 19:48:54.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:54 compute-0 nova_compute[183177]: 2026-01-26 19:48:54.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:48:54 compute-0 nova_compute[183177]: 2026-01-26 19:48:54.808 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:54 compute-0 ovn_controller[95396]: 2026-01-26T19:48:54Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:e5:47 10.100.0.6
Jan 26 19:48:54 compute-0 ovn_controller[95396]: 2026-01-26T19:48:54Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:e5:47 10.100.0.6
Jan 26 19:48:54 compute-0 sshd-session[208846]: error: maximum authentication attempts exceeded for root from 112.119.212.162 port 41134 ssh2 [preauth]
Jan 26 19:48:54 compute-0 sshd-session[208846]: Disconnecting authenticating user root 112.119.212.162 port 41134: Too many authentication failures [preauth]
Jan 26 19:48:55 compute-0 nova_compute[183177]: 2026-01-26 19:48:55.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:48:56 compute-0 nova_compute[183177]: 2026-01-26 19:48:56.672 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:48:57 compute-0 nova_compute[183177]: 2026-01-26 19:48:57.724 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:57 compute-0 nova_compute[183177]: 2026-01-26 19:48:57.803 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:57 compute-0 nova_compute[183177]: 2026-01-26 19:48:57.804 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:57 compute-0 nova_compute[183177]: 2026-01-26 19:48:57.867 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:57 compute-0 nova_compute[183177]: 2026-01-26 19:48:57.906 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.045 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.047 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.064 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.064 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5611MB free_disk=73.06982421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.065 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:48:58 compute-0 nova_compute[183177]: 2026-01-26 19:48:58.065 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:48:58 compute-0 sshd-session[208863]: error: maximum authentication attempts exceeded for root from 112.119.212.162 port 41804 ssh2 [preauth]
Jan 26 19:48:58 compute-0 sshd-session[208863]: Disconnecting authenticating user root 112.119.212.162 port 41804: Too many authentication failures [preauth]
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.151 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.152 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.152 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:48:58 up  1:13,  0 user,  load average: 0.29, 0.22, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_f91153215558476397ac0fa698028694': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.250 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:48:59 compute-0 podman[192499]: time="2026-01-26T19:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:48:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:48:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.759 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:48:59 compute-0 nova_compute[183177]: 2026-01-26 19:48:59.843 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:00 compute-0 nova_compute[183177]: 2026-01-26 19:49:00.272 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:49:00 compute-0 nova_compute[183177]: 2026-01-26 19:49:00.273 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:01 compute-0 openstack_network_exporter[195363]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:49:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:49:01 compute-0 openstack_network_exporter[195363]: ERROR   19:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:49:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:49:02 compute-0 sshd-session[208873]: error: maximum authentication attempts exceeded for root from 112.119.212.162 port 42244 ssh2 [preauth]
Jan 26 19:49:02 compute-0 sshd-session[208873]: Disconnecting authenticating user root 112.119.212.162 port 42244: Too many authentication failures [preauth]
Jan 26 19:49:02 compute-0 nova_compute[183177]: 2026-01-26 19:49:02.908 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:03 compute-0 nova_compute[183177]: 2026-01-26 19:49:03.273 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:03 compute-0 nova_compute[183177]: 2026-01-26 19:49:03.274 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:04 compute-0 nova_compute[183177]: 2026-01-26 19:49:04.889 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:05 compute-0 sshd-session[208875]: Received disconnect from 112.119.212.162 port 42730:11: disconnected by user [preauth]
Jan 26 19:49:05 compute-0 sshd-session[208875]: Disconnected from authenticating user root 112.119.212.162 port 42730 [preauth]
Jan 26 19:49:06 compute-0 podman[208879]: 2026-01-26 19:49:06.403053995 +0000 UTC m=+0.141671651 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:49:07 compute-0 nova_compute[183177]: 2026-01-26 19:49:07.910 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:08.131 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:49:08 compute-0 nova_compute[183177]: 2026-01-26 19:49:08.132 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:08.133 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:49:08 compute-0 podman[208907]: 2026-01-26 19:49:08.340830258 +0000 UTC m=+0.079153979 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 19:49:08 compute-0 podman[208906]: 2026-01-26 19:49:08.374472169 +0000 UTC m=+0.117580271 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Jan 26 19:49:09 compute-0 sshd-session[208877]: Invalid user admin from 112.119.212.162 port 43246
Jan 26 19:49:09 compute-0 nova_compute[183177]: 2026-01-26 19:49:09.891 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:10 compute-0 sshd-session[208877]: error: maximum authentication attempts exceeded for invalid user admin from 112.119.212.162 port 43246 ssh2 [preauth]
Jan 26 19:49:10 compute-0 sshd-session[208877]: Disconnecting invalid user admin 112.119.212.162 port 43246: Too many authentication failures [preauth]
Jan 26 19:49:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:11.135 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:49:12 compute-0 sshd-session[208943]: Invalid user admin from 112.119.212.162 port 43834
Jan 26 19:49:12 compute-0 nova_compute[183177]: 2026-01-26 19:49:12.912 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:13 compute-0 sshd-session[208943]: error: maximum authentication attempts exceeded for invalid user admin from 112.119.212.162 port 43834 ssh2 [preauth]
Jan 26 19:49:13 compute-0 sshd-session[208943]: Disconnecting invalid user admin 112.119.212.162 port 43834: Too many authentication failures [preauth]
Jan 26 19:49:14 compute-0 podman[208945]: 2026-01-26 19:49:14.35426958 +0000 UTC m=+0.088856144 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:49:14 compute-0 nova_compute[183177]: 2026-01-26 19:49:14.894 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:17 compute-0 sshd-session[208971]: Invalid user admin from 112.119.212.162 port 44318
Jan 26 19:49:17 compute-0 nova_compute[183177]: 2026-01-26 19:49:17.915 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:17 compute-0 sshd-session[208971]: Received disconnect from 112.119.212.162 port 44318:11: disconnected by user [preauth]
Jan 26 19:49:17 compute-0 sshd-session[208971]: Disconnected from invalid user admin 112.119.212.162 port 44318 [preauth]
Jan 26 19:49:19 compute-0 nova_compute[183177]: 2026-01-26 19:49:19.932 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:20 compute-0 sshd-session[208973]: Invalid user oracle from 112.119.212.162 port 44866
Jan 26 19:49:21 compute-0 sshd-session[208973]: error: maximum authentication attempts exceeded for invalid user oracle from 112.119.212.162 port 44866 ssh2 [preauth]
Jan 26 19:49:21 compute-0 sshd-session[208973]: Disconnecting invalid user oracle 112.119.212.162 port 44866: Too many authentication failures [preauth]
Jan 26 19:49:21 compute-0 nova_compute[183177]: 2026-01-26 19:49:21.805 183181 DEBUG nova.compute.manager [None req-fd64c678-ba61-4906-96fb-5e67bfa70d1b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 19:49:21 compute-0 nova_compute[183177]: 2026-01-26 19:49:21.891 183181 DEBUG nova.compute.provider_tree [None req-fd64c678-ba61-4906-96fb-5e67bfa70d1b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 17 to 19 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:49:22 compute-0 nova_compute[183177]: 2026-01-26 19:49:22.918 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:23 compute-0 sshd-session[208975]: Invalid user oracle from 112.119.212.162 port 45352
Jan 26 19:49:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:24.052 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:24.052 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:24.057 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:24 compute-0 nova_compute[183177]: 2026-01-26 19:49:24.936 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:25 compute-0 sshd-session[208975]: error: maximum authentication attempts exceeded for invalid user oracle from 112.119.212.162 port 45352 ssh2 [preauth]
Jan 26 19:49:25 compute-0 sshd-session[208975]: Disconnecting invalid user oracle 112.119.212.162 port 45352: Too many authentication failures [preauth]
Jan 26 19:49:27 compute-0 nova_compute[183177]: 2026-01-26 19:49:27.921 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:29 compute-0 sshd-session[208978]: Invalid user oracle from 112.119.212.162 port 45804
Jan 26 19:49:29 compute-0 podman[192499]: time="2026-01-26T19:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:49:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:49:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Jan 26 19:49:29 compute-0 sshd-session[208978]: Received disconnect from 112.119.212.162 port 45804:11: disconnected by user [preauth]
Jan 26 19:49:29 compute-0 sshd-session[208978]: Disconnected from invalid user oracle 112.119.212.162 port 45804 [preauth]
Jan 26 19:49:29 compute-0 nova_compute[183177]: 2026-01-26 19:49:29.971 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:31 compute-0 openstack_network_exporter[195363]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:49:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:49:31 compute-0 openstack_network_exporter[195363]: ERROR   19:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:49:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:49:32 compute-0 sshd-session[208980]: Invalid user usuario from 112.119.212.162 port 46414
Jan 26 19:49:32 compute-0 nova_compute[183177]: 2026-01-26 19:49:32.923 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:33 compute-0 sshd-session[208980]: error: maximum authentication attempts exceeded for invalid user usuario from 112.119.212.162 port 46414 ssh2 [preauth]
Jan 26 19:49:33 compute-0 sshd-session[208980]: Disconnecting invalid user usuario 112.119.212.162 port 46414: Too many authentication failures [preauth]
Jan 26 19:49:34 compute-0 nova_compute[183177]: 2026-01-26 19:49:34.989 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Check if temp file /var/lib/nova/instances/tmpdlv17oeb exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 19:49:34 compute-0 nova_compute[183177]: 2026-01-26 19:49:34.996 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdlv17oeb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 19:49:35 compute-0 nova_compute[183177]: 2026-01-26 19:49:35.014 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:35 compute-0 sshd-session[208982]: Invalid user usuario from 112.119.212.162 port 46876
Jan 26 19:49:36 compute-0 sshd-session[208982]: error: maximum authentication attempts exceeded for invalid user usuario from 112.119.212.162 port 46876 ssh2 [preauth]
Jan 26 19:49:36 compute-0 sshd-session[208982]: Disconnecting invalid user usuario 112.119.212.162 port 46876: Too many authentication failures [preauth]
Jan 26 19:49:37 compute-0 podman[208984]: 2026-01-26 19:49:37.42149822 +0000 UTC m=+0.164513566 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 19:49:37 compute-0 nova_compute[183177]: 2026-01-26 19:49:37.926 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:37 compute-0 ovn_controller[95396]: 2026-01-26T19:49:37Z|00113|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 19:49:39 compute-0 podman[209013]: 2026-01-26 19:49:39.341766454 +0000 UTC m=+0.073704350 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:49:39 compute-0 podman[209012]: 2026-01-26 19:49:39.346788051 +0000 UTC m=+0.087211759 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64)
Jan 26 19:49:39 compute-0 nova_compute[183177]: 2026-01-26 19:49:39.996 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.016 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.078 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.079 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.149 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.151 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Preparing to wait for external event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.152 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.152 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:40 compute-0 nova_compute[183177]: 2026-01-26 19:49:40.153 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:40 compute-0 sshd-session[208995]: Invalid user usuario from 112.119.212.162 port 47370
Jan 26 19:49:40 compute-0 sshd-session[208995]: Received disconnect from 112.119.212.162 port 47370:11: disconnected by user [preauth]
Jan 26 19:49:40 compute-0 sshd-session[208995]: Disconnected from invalid user usuario 112.119.212.162 port 47370 [preauth]
Jan 26 19:49:42 compute-0 nova_compute[183177]: 2026-01-26 19:49:42.928 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:43 compute-0 sshd-session[209055]: Invalid user test from 112.119.212.162 port 47818
Jan 26 19:49:44 compute-0 sshd-session[209055]: error: maximum authentication attempts exceeded for invalid user test from 112.119.212.162 port 47818 ssh2 [preauth]
Jan 26 19:49:44 compute-0 sshd-session[209055]: Disconnecting invalid user test 112.119.212.162 port 47818: Too many authentication failures [preauth]
Jan 26 19:49:45 compute-0 nova_compute[183177]: 2026-01-26 19:49:45.018 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:45 compute-0 podman[209059]: 2026-01-26 19:49:45.344934814 +0000 UTC m=+0.080709792 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:49:46 compute-0 sshd-session[209057]: Invalid user test from 112.119.212.162 port 48266
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.424 183181 DEBUG nova.compute.manager [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.424 183181 DEBUG oslo_concurrency.lockutils [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.424 183181 DEBUG oslo_concurrency.lockutils [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.425 183181 DEBUG oslo_concurrency.lockutils [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.425 183181 DEBUG nova.compute.manager [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No event matching network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 in dict_keys([('network-vif-plugged', 'e510673a-2b92-49f7-83b8-aba8a3f87ee7')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.426 183181 DEBUG nova.compute.manager [req-6428662d-6394-4fa7-a04c-479283d37c42 req-dcf4c25a-73cf-4c91-874c-45072df3767e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:49:47 compute-0 sshd-session[209057]: error: maximum authentication attempts exceeded for invalid user test from 112.119.212.162 port 48266 ssh2 [preauth]
Jan 26 19:49:47 compute-0 sshd-session[209057]: Disconnecting invalid user test 112.119.212.162 port 48266: Too many authentication failures [preauth]
Jan 26 19:49:47 compute-0 nova_compute[183177]: 2026-01-26 19:49:47.931 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.532 183181 DEBUG nova.compute.manager [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.533 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.533 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.534 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.534 183181 DEBUG nova.compute.manager [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Processing event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.534 183181 DEBUG nova.compute.manager [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-changed-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.535 183181 DEBUG nova.compute.manager [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Refreshing instance network info cache due to event network-changed-e510673a-2b92-49f7-83b8-aba8a3f87ee7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.535 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.536 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.536 183181 DEBUG nova.network.neutron [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Refreshing network info cache for port e510673a-2b92-49f7-83b8-aba8a3f87ee7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.691 183181 INFO nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Took 9.54 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 19:49:49 compute-0 nova_compute[183177]: 2026-01-26 19:49:49.692 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.020 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.044 183181 WARNING neutronclient.v2_0.client [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.208 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdlv17oeb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(5bf28b5c-5d20-4ea9-afb8-34104de8ef0f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.721 183181 DEBUG nova.objects.instance [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.723 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.728 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.729 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:49:50 compute-0 sshd-session[209085]: Invalid user test from 112.119.212.162 port 48778
Jan 26 19:49:50 compute-0 nova_compute[183177]: 2026-01-26 19:49:50.951 183181 WARNING neutronclient.v2_0.client [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.232 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.233 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:49:51 compute-0 sshd-session[209085]: Received disconnect from 112.119.212.162 port 48778:11: disconnected by user [preauth]
Jan 26 19:49:51 compute-0 sshd-session[209085]: Disconnected from invalid user test 112.119.212.162 port 48778 [preauth]
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.301 183181 DEBUG nova.virt.libvirt.vif [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1019962861',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1019962861',id=12,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:48:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-fji1gb34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:48:42Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.302 183181 DEBUG nova.network.os_vif_util [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.303 183181 DEBUG nova.network.os_vif_util [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.304 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:15:e5:47"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <target dev="tape510673a-2b"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]: </interface>
Jan 26 19:49:51 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.305 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <name>instance-0000000c</name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <uuid>6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</uuid>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1019962861</nova:name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:48:37</nova:creationTime>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:port uuid="e510673a-2b92-49f7-83b8-aba8a3f87ee7">
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="serial">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="uuid">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:15:e5:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape510673a-2b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </target>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </console>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </input>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]: </domain>
Jan 26 19:49:51 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.306 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <name>instance-0000000c</name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <uuid>6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</uuid>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1019962861</nova:name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:48:37</nova:creationTime>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:port uuid="e510673a-2b92-49f7-83b8-aba8a3f87ee7">
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="serial">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="uuid">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:15:e5:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape510673a-2b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </target>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </console>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </input>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]: </domain>
Jan 26 19:49:51 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.307 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <name>instance-0000000c</name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <uuid>6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</uuid>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1019962861</nova:name>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:48:37</nova:creationTime>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <nova:port uuid="e510673a-2b92-49f7-83b8-aba8a3f87ee7">
Jan 26 19:49:51 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="serial">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="uuid">6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </system>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </os>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </features>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/disk.config"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:15:e5:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape510673a-2b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:49:51 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       </target>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d/console.log" append="off"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </console>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </input>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </video>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:49:51 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:49:51 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:49:51 compute-0 nova_compute[183177]: </domain>
Jan 26 19:49:51 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.308 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.564 183181 DEBUG nova.network.neutron [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updated VIF entry in instance network info cache for port e510673a-2b92-49f7-83b8-aba8a3f87ee7. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.565 183181 DEBUG nova.network.neutron [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updating instance_info_cache with network_info: [{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.735 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:49:51 compute-0 nova_compute[183177]: 2026-01-26 19:49:51.736 183181 INFO nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 19:49:52 compute-0 nova_compute[183177]: 2026-01-26 19:49:52.072 183181 DEBUG oslo_concurrency.lockutils [req-16ad452b-ccfa-4f6a-833f-9f6197acca32 req-e5668050-1fef-46b8-bb70-cf1c5b11866b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:49:52 compute-0 nova_compute[183177]: 2026-01-26 19:49:52.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:52 compute-0 nova_compute[183177]: 2026-01-26 19:49:52.757 183181 INFO nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 19:49:52 compute-0 nova_compute[183177]: 2026-01-26 19:49:52.934 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.263 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.263 183181 DEBUG nova.virt.libvirt.migration [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 19:49:53 compute-0 kernel: tape510673a-2b (unregistering): left promiscuous mode
Jan 26 19:49:53 compute-0 NetworkManager[55489]: <info>  [1769456993.3445] device (tape510673a-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:49:53 compute-0 ovn_controller[95396]: 2026-01-26T19:49:53Z|00114|binding|INFO|Releasing lport e510673a-2b92-49f7-83b8-aba8a3f87ee7 from this chassis (sb_readonly=0)
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.363 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 ovn_controller[95396]: 2026-01-26T19:49:53Z|00115|binding|INFO|Setting lport e510673a-2b92-49f7-83b8-aba8a3f87ee7 down in Southbound
Jan 26 19:49:53 compute-0 ovn_controller[95396]: 2026-01-26T19:49:53Z|00116|binding|INFO|Removing iface tape510673a-2b ovn-installed in OVS
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.366 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.400 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 26 19:49:53 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 16.599s CPU time.
Jan 26 19:49:53 compute-0 systemd-machined[154465]: Machine qemu-9-instance-0000000c terminated.
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.481 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:e5:47 10.100.0.6'], port_security=['fa:16:3e:15:e5:47 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dcc3a434-5bc5-4cb2-8878-ad4556ff41ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd82498-4e86-414e-9a6d-c217ab314723, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=e510673a-2b92-49f7-83b8-aba8a3f87ee7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.482 104672 INFO neutron.agent.ovn.metadata.agent [-] Port e510673a-2b92-49f7-83b8-aba8a3f87ee7 in datapath 147aa3ea-66ec-4250-9408-de2c9a19f4fa unbound from our chassis
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.484 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 147aa3ea-66ec-4250-9408-de2c9a19f4fa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.487 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[963d3ef9-efbb-4c0f-8e6d-5ac32e3e7fef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.487 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa namespace which is not needed anymore
Jan 26 19:49:53 compute-0 sshd-session[209088]: Invalid user user from 112.119.212.162 port 49240
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.609 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.609 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.610 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 19:49:53 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [NOTICE]   (208810) : haproxy version is 3.0.5-8e879a5
Jan 26 19:49:53 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [NOTICE]   (208810) : path to executable is /usr/sbin/haproxy
Jan 26 19:49:53 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [WARNING]  (208810) : Exiting Master process...
Jan 26 19:49:53 compute-0 podman[209149]: 2026-01-26 19:49:53.658264458 +0000 UTC m=+0.035340050 container kill db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 26 19:49:53 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [ALERT]    (208810) : Current worker (208812) exited with code 143 (Terminated)
Jan 26 19:49:53 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[208806]: [WARNING]  (208810) : All workers exited. Exiting... (0)
Jan 26 19:49:53 compute-0 systemd[1]: libpod-db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a.scope: Deactivated successfully.
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.716 183181 DEBUG nova.compute.manager [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.716 183181 DEBUG oslo_concurrency.lockutils [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.717 183181 DEBUG oslo_concurrency.lockutils [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.717 183181 DEBUG oslo_concurrency.lockutils [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.717 183181 DEBUG nova.compute.manager [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.717 183181 DEBUG nova.compute.manager [req-5db99f1c-bc74-460d-bec7-3759cc54908b req-b6d42a73-3e49-4d4f-89de-1b9a5974a71c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:49:53 compute-0 podman[209165]: 2026-01-26 19:49:53.721548251 +0000 UTC m=+0.039899665 container died db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 19:49:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a-userdata-shm.mount: Deactivated successfully.
Jan 26 19:49:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-02b823549fbb3030cad8d1f38c786e4dfe427f080cc81246ff9ca55108e2d64a-merged.mount: Deactivated successfully.
Jan 26 19:49:53 compute-0 podman[209165]: 2026-01-26 19:49:53.762714728 +0000 UTC m=+0.081066082 container cleanup db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120)
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.767 183181 DEBUG nova.virt.libvirt.guest [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d' (instance-0000000c) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.768 183181 INFO nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migration operation has completed
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.768 183181 INFO nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] _post_live_migration() is started..
Jan 26 19:49:53 compute-0 systemd[1]: libpod-conmon-db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a.scope: Deactivated successfully.
Jan 26 19:49:53 compute-0 podman[209167]: 2026-01-26 19:49:53.782877891 +0000 UTC m=+0.088857156 container remove db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.784 183181 WARNING neutronclient.v2_0.client [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.785 183181 WARNING neutronclient.v2_0.client [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.802 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dd08ccc3-23bb-4e7b-b673-683a86a8eb4f]: (4, ("Mon Jan 26 07:49:53 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa (db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a)\ndb27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a\nMon Jan 26 07:49:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa (db27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a)\ndb27012dcb8f49809f4ab07c79d5ef8818f42f3aea606fdfcda1cc9dce1c7c5a\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.804 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1a37ae90-3b7f-4022-a5d3-cf6317a463e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.805 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.806 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b15af1c7-e9c0-4ca2-8870-56cfff469c45]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.806 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap147aa3ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.809 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 kernel: tap147aa3ea-60: left promiscuous mode
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.840 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.841 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d472daa0-3b63-4784-ace8-d48a9188e8f0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.859 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bd7c96-06b5-46fe-a51d-270208ae7494]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.861 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[566ce988-e4b8-4a4d-9e4f-87e22dd5ec1c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 nova_compute[183177]: 2026-01-26 19:49:53.871 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.871 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.882 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[25884875-3353-4480-935f-19430d6190e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438003, 'reachable_time': 44596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209201, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.886 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.886 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[b0421641-4919-4a0c-b1ee-709de30ba93f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:49:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d147aa3ea\x2d66ec\x2d4250\x2d9408\x2dde2c9a19f4fa.mount: Deactivated successfully.
Jan 26 19:49:53 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:49:53.887 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.158 183181 DEBUG nova.compute.manager [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.159 183181 DEBUG oslo_concurrency.lockutils [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.160 183181 DEBUG oslo_concurrency.lockutils [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.160 183181 DEBUG oslo_concurrency.lockutils [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.161 183181 DEBUG nova.compute.manager [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.161 183181 DEBUG nova.compute.manager [req-00c5f3f6-acf2-449f-9f0f-c051e934a246 req-a796a43e-4936-4a68-b734-56072d4eb8f8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.613 183181 DEBUG nova.network.neutron [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port e510673a-2b92-49f7-83b8-aba8a3f87ee7 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.615 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.617 183181 DEBUG nova.virt.libvirt.vif [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:48:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1019962861',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1019962861',id=12,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:48:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-fji1gb34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:49:27Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.617 183181 DEBUG nova.network.os_vif_util [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "address": "fa:16:3e:15:e5:47", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape510673a-2b", "ovs_interfaceid": "e510673a-2b92-49f7-83b8-aba8a3f87ee7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.619 183181 DEBUG nova.network.os_vif_util [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.619 183181 DEBUG os_vif [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.622 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.622 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape510673a-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.624 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.626 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.627 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.628 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0ac5e89a-a5d3-453a-a728-52b637227990) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.629 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 sshd-session[209088]: error: maximum authentication attempts exceeded for invalid user user from 112.119.212.162 port 49240 ssh2 [preauth]
Jan 26 19:49:54 compute-0 sshd-session[209088]: Disconnecting invalid user user 112.119.212.162 port 49240: Too many authentication failures [preauth]
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.630 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.635 183181 INFO os_vif [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:e5:47,bridge_name='br-int',has_traffic_filtering=True,id=e510673a-2b92-49f7-83b8-aba8a3f87ee7,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape510673a-2b')
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.636 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.636 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.637 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.637 183181 DEBUG nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.637 183181 INFO nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Deleting instance files /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d_del
Jan 26 19:49:54 compute-0 nova_compute[183177]: 2026-01-26 19:49:54.639 183181 INFO nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Deletion of /var/lib/nova/instances/6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d_del complete
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.053 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.839 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.839 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.840 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.841 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.842 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.842 183181 WARNING nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received unexpected event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with vm_state active and task_state migrating.
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.843 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.843 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.844 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.845 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.845 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.846 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-unplugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.846 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.847 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.848 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.848 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.849 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.850 183181 WARNING nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received unexpected event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with vm_state active and task_state migrating.
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.850 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.851 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.851 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.852 183181 DEBUG oslo_concurrency.lockutils [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.853 183181 DEBUG nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] No waiting events found dispatching network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:49:55 compute-0 nova_compute[183177]: 2026-01-26 19:49:55.853 183181 WARNING nova.compute.manager [req-e6aea10b-7569-46d5-afd6-c4950fe335c9 req-f29c4875-c3ff-44e6-b073-7b23c80587f9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Received unexpected event network-vif-plugged-e510673a-2b92-49f7-83b8-aba8a3f87ee7 for instance with vm_state active and task_state migrating.
Jan 26 19:49:56 compute-0 sshd-session[209204]: Invalid user user from 112.119.212.162 port 49684
Jan 26 19:49:57 compute-0 nova_compute[183177]: 2026-01-26 19:49:57.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:57 compute-0 sshd-session[209204]: error: maximum authentication attempts exceeded for invalid user user from 112.119.212.162 port 49684 ssh2 [preauth]
Jan 26 19:49:57 compute-0 sshd-session[209204]: Disconnecting invalid user user 112.119.212.162 port 49684: Too many authentication failures [preauth]
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.918 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.920 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.949 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.950 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5799MB free_disk=73.09858703613281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.951 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:49:58 compute-0 nova_compute[183177]: 2026-01-26 19:49:58.951 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:49:59 compute-0 nova_compute[183177]: 2026-01-26 19:49:59.630 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:49:59 compute-0 podman[192499]: time="2026-01-26T19:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:49:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:49:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 19:49:59 compute-0 nova_compute[183177]: 2026-01-26 19:49:59.978 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Updating resource usage from migration 5bf28b5c-5d20-4ea9-afb8-34104de8ef0f
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.056 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.104 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration 5bf28b5c-5d20-4ea9-afb8-34104de8ef0f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.104 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.104 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:49:58 up  1:14,  0 user,  load average: 0.14, 0.20, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_f91153215558476397ac0fa698028694': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:50:00 compute-0 sshd-session[209206]: Invalid user user from 112.119.212.162 port 50098
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.173 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:50:00 compute-0 nova_compute[183177]: 2026-01-26 19:50:00.678 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:50:01 compute-0 sshd-session[209206]: Received disconnect from 112.119.212.162 port 50098:11: disconnected by user [preauth]
Jan 26 19:50:01 compute-0 sshd-session[209206]: Disconnected from invalid user user 112.119.212.162 port 50098 [preauth]
Jan 26 19:50:01 compute-0 nova_compute[183177]: 2026-01-26 19:50:01.189 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:50:01 compute-0 nova_compute[183177]: 2026-01-26 19:50:01.190 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.238s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:01 compute-0 openstack_network_exporter[195363]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:50:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:50:01 compute-0 openstack_network_exporter[195363]: ERROR   19:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:50:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:50:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:02.888 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:03 compute-0 nova_compute[183177]: 2026-01-26 19:50:03.190 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:03 compute-0 nova_compute[183177]: 2026-01-26 19:50:03.192 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:03 compute-0 nova_compute[183177]: 2026-01-26 19:50:03.679 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:03 compute-0 nova_compute[183177]: 2026-01-26 19:50:03.679 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:03 compute-0 nova_compute[183177]: 2026-01-26 19:50:03.680 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:03 compute-0 sshd-session[209210]: Invalid user ftpuser from 112.119.212.162 port 50594
Jan 26 19:50:04 compute-0 sshd-session[209212]: Invalid user hduser from 193.32.162.151 port 43642
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.191 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.191 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.192 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.192 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:50:04 compute-0 sshd-session[209212]: Connection closed by invalid user hduser 193.32.162.151 port 43642 [preauth]
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.318 183181 WARNING nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.318 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.354 183181 DEBUG oslo_concurrency.processutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.356 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5801MB free_disk=73.09856414794922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.356 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.357 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:04 compute-0 nova_compute[183177]: 2026-01-26 19:50:04.632 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:04 compute-0 sshd-session[209210]: error: maximum authentication attempts exceeded for invalid user ftpuser from 112.119.212.162 port 50594 ssh2 [preauth]
Jan 26 19:50:04 compute-0 sshd-session[209210]: Disconnecting invalid user ftpuser 112.119.212.162 port 50594: Too many authentication failures [preauth]
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.094 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.379 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.888 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.921 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 5bf28b5c-5d20-4ea9-afb8-34104de8ef0f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.921 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.922 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:50:04 up  1:14,  0 user,  load average: 0.13, 0.19, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:50:05 compute-0 nova_compute[183177]: 2026-01-26 19:50:05.960 183181 DEBUG nova.compute.provider_tree [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:50:06 compute-0 nova_compute[183177]: 2026-01-26 19:50:06.468 183181 DEBUG nova.scheduler.client.report [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:50:06 compute-0 nova_compute[183177]: 2026-01-26 19:50:06.980 183181 DEBUG nova.compute.resource_tracker [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:50:06 compute-0 nova_compute[183177]: 2026-01-26 19:50:06.981 183181 DEBUG oslo_concurrency.lockutils [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.624s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:07 compute-0 sshd-session[209215]: Invalid user ftpuser from 112.119.212.162 port 51124
Jan 26 19:50:07 compute-0 podman[209217]: 2026-01-26 19:50:07.626941334 +0000 UTC m=+0.134891955 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 26 19:50:08 compute-0 nova_compute[183177]: 2026-01-26 19:50:08.139 183181 INFO nova.compute.manager [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 19:50:08 compute-0 sshd-session[209215]: error: maximum authentication attempts exceeded for invalid user ftpuser from 112.119.212.162 port 51124 ssh2 [preauth]
Jan 26 19:50:08 compute-0 sshd-session[209215]: Disconnecting invalid user ftpuser 112.119.212.162 port 51124: Too many authentication failures [preauth]
Jan 26 19:50:09 compute-0 nova_compute[183177]: 2026-01-26 19:50:09.392 183181 INFO nova.scheduler.client.report [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 5bf28b5c-5d20-4ea9-afb8-34104de8ef0f
Jan 26 19:50:09 compute-0 nova_compute[183177]: 2026-01-26 19:50:09.393 183181 DEBUG nova.virt.libvirt.driver [None req-4aaafdb8-8765-4f9c-b9e3-ff6c3d26fa92 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6922f9dc-b0b7-4bf9-995a-48ed6bc48a8d] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 19:50:09 compute-0 nova_compute[183177]: 2026-01-26 19:50:09.636 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:10 compute-0 nova_compute[183177]: 2026-01-26 19:50:10.096 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:10 compute-0 podman[209248]: 2026-01-26 19:50:10.333205237 +0000 UTC m=+0.075467093 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:50:10 compute-0 podman[209247]: 2026-01-26 19:50:10.359768342 +0000 UTC m=+0.104401432 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:50:11 compute-0 sshd-session[209245]: Invalid user ftpuser from 112.119.212.162 port 51594
Jan 26 19:50:12 compute-0 sshd-session[209245]: Received disconnect from 112.119.212.162 port 51594:11: disconnected by user [preauth]
Jan 26 19:50:12 compute-0 sshd-session[209245]: Disconnected from invalid user ftpuser 112.119.212.162 port 51594 [preauth]
Jan 26 19:50:14 compute-0 nova_compute[183177]: 2026-01-26 19:50:14.639 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:14 compute-0 sshd-session[209285]: Invalid user test1 from 112.119.212.162 port 52100
Jan 26 19:50:15 compute-0 nova_compute[183177]: 2026-01-26 19:50:15.099 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:15 compute-0 sshd-session[209285]: error: maximum authentication attempts exceeded for invalid user test1 from 112.119.212.162 port 52100 ssh2 [preauth]
Jan 26 19:50:15 compute-0 sshd-session[209285]: Disconnecting invalid user test1 112.119.212.162 port 52100: Too many authentication failures [preauth]
Jan 26 19:50:16 compute-0 podman[209287]: 2026-01-26 19:50:16.345359677 +0000 UTC m=+0.089092090 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:50:19 compute-0 sshd-session[209310]: Invalid user test1 from 112.119.212.162 port 52650
Jan 26 19:50:19 compute-0 nova_compute[183177]: 2026-01-26 19:50:19.640 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:20 compute-0 nova_compute[183177]: 2026-01-26 19:50:20.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:20 compute-0 sshd-session[209310]: error: maximum authentication attempts exceeded for invalid user test1 from 112.119.212.162 port 52650 ssh2 [preauth]
Jan 26 19:50:20 compute-0 sshd-session[209310]: Disconnecting invalid user test1 112.119.212.162 port 52650: Too many authentication failures [preauth]
Jan 26 19:50:23 compute-0 sshd-session[209312]: Invalid user test1 from 112.119.212.162 port 53156
Jan 26 19:50:23 compute-0 sshd-session[209312]: Received disconnect from 112.119.212.162 port 53156:11: disconnected by user [preauth]
Jan 26 19:50:23 compute-0 sshd-session[209312]: Disconnected from invalid user test1 112.119.212.162 port 53156 [preauth]
Jan 26 19:50:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:24.058 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:24.059 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:24.059 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:24 compute-0 nova_compute[183177]: 2026-01-26 19:50:24.642 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:25 compute-0 nova_compute[183177]: 2026-01-26 19:50:25.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:26 compute-0 sshd-session[209314]: Invalid user test2 from 112.119.212.162 port 53558
Jan 26 19:50:28 compute-0 sshd-session[209314]: error: maximum authentication attempts exceeded for invalid user test2 from 112.119.212.162 port 53558 ssh2 [preauth]
Jan 26 19:50:28 compute-0 sshd-session[209314]: Disconnecting invalid user test2 112.119.212.162 port 53558: Too many authentication failures [preauth]
Jan 26 19:50:28 compute-0 nova_compute[183177]: 2026-01-26 19:50:28.610 183181 DEBUG nova.compute.manager [None req-558f732f-7820-4c61-95d0-0135f1275666 a22c940165ed49a990de4c1cf7e61838 bbc26b645f2a4d108c00608f11fdebb2 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 19:50:28 compute-0 nova_compute[183177]: 2026-01-26 19:50:28.683 183181 DEBUG nova.compute.provider_tree [None req-558f732f-7820-4c61-95d0-0135f1275666 a22c940165ed49a990de4c1cf7e61838 bbc26b645f2a4d108c00608f11fdebb2 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 19 to 22 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:50:29 compute-0 nova_compute[183177]: 2026-01-26 19:50:29.646 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:30 compute-0 podman[192499]: time="2026-01-26T19:50:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:50:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:50:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:50:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:50:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 26 19:50:30 compute-0 nova_compute[183177]: 2026-01-26 19:50:30.106 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:30 compute-0 sshd-session[209317]: Invalid user test2 from 112.119.212.162 port 54206
Jan 26 19:50:31 compute-0 openstack_network_exporter[195363]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:50:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:50:31 compute-0 openstack_network_exporter[195363]: ERROR   19:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:50:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:50:31 compute-0 sshd-session[209317]: error: maximum authentication attempts exceeded for invalid user test2 from 112.119.212.162 port 54206 ssh2 [preauth]
Jan 26 19:50:31 compute-0 sshd-session[209317]: Disconnecting invalid user test2 112.119.212.162 port 54206: Too many authentication failures [preauth]
Jan 26 19:50:31 compute-0 nova_compute[183177]: 2026-01-26 19:50:31.935 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:31 compute-0 nova_compute[183177]: 2026-01-26 19:50:31.936 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:32 compute-0 nova_compute[183177]: 2026-01-26 19:50:32.442 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:50:32 compute-0 nova_compute[183177]: 2026-01-26 19:50:32.988 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:32 compute-0 nova_compute[183177]: 2026-01-26 19:50:32.988 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:32 compute-0 nova_compute[183177]: 2026-01-26 19:50:32.995 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:50:32 compute-0 nova_compute[183177]: 2026-01-26 19:50:32.996 183181 INFO nova.compute.claims [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:50:34 compute-0 nova_compute[183177]: 2026-01-26 19:50:34.063 183181 DEBUG nova.compute.provider_tree [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:50:34 compute-0 sshd-session[209319]: Invalid user test2 from 112.119.212.162 port 54666
Jan 26 19:50:34 compute-0 nova_compute[183177]: 2026-01-26 19:50:34.572 183181 DEBUG nova.scheduler.client.report [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:50:34 compute-0 nova_compute[183177]: 2026-01-26 19:50:34.649 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:34 compute-0 sshd-session[209319]: Received disconnect from 112.119.212.162 port 54666:11: disconnected by user [preauth]
Jan 26 19:50:34 compute-0 sshd-session[209319]: Disconnected from invalid user test2 112.119.212.162 port 54666 [preauth]
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.081 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.083 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.107 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.598 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.599 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.600 183181 WARNING neutronclient.v2_0.client [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:50:35 compute-0 nova_compute[183177]: 2026-01-26 19:50:35.600 183181 WARNING neutronclient.v2_0.client [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:50:36 compute-0 nova_compute[183177]: 2026-01-26 19:50:36.113 183181 INFO nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:50:36 compute-0 nova_compute[183177]: 2026-01-26 19:50:36.627 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.235 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Successfully created port: 77fc680a-e843-40d6-8230-7edc80f67312 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:50:37 compute-0 sshd-session[209321]: Invalid user ubuntu from 112.119.212.162 port 55062
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.650 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.652 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.653 183181 INFO nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Creating image(s)
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.655 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.656 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.657 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.658 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.665 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.669 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.756 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.758 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.759 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.760 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.766 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.768 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.852 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.853 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.898 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.899 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.900 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.963 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.965 183181 DEBUG nova.virt.disk.api [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Checking if we can resize image /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:50:37 compute-0 nova_compute[183177]: 2026-01-26 19:50:37.966 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.043 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.046 183181 DEBUG nova.virt.disk.api [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Cannot resize image /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.047 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.048 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Ensure instance console log exists: /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.049 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.050 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.050 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:38 compute-0 podman[209338]: 2026-01-26 19:50:38.387489766 +0000 UTC m=+0.127799883 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Jan 26 19:50:38 compute-0 sshd-session[209321]: error: maximum authentication attempts exceeded for invalid user ubuntu from 112.119.212.162 port 55062 ssh2 [preauth]
Jan 26 19:50:38 compute-0 sshd-session[209321]: Disconnecting invalid user ubuntu 112.119.212.162 port 55062: Too many authentication failures [preauth]
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.677 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Successfully updated port: 77fc680a-e843-40d6-8230-7edc80f67312 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.730 183181 DEBUG nova.compute.manager [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-changed-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.730 183181 DEBUG nova.compute.manager [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Refreshing instance network info cache due to event network-changed-77fc680a-e843-40d6-8230-7edc80f67312. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.731 183181 DEBUG oslo_concurrency.lockutils [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.731 183181 DEBUG oslo_concurrency.lockutils [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:50:38 compute-0 nova_compute[183177]: 2026-01-26 19:50:38.731 183181 DEBUG nova.network.neutron [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Refreshing network info cache for port 77fc680a-e843-40d6-8230-7edc80f67312 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:50:39 compute-0 nova_compute[183177]: 2026-01-26 19:50:39.186 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:50:39 compute-0 nova_compute[183177]: 2026-01-26 19:50:39.238 183181 WARNING neutronclient.v2_0.client [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:50:39 compute-0 nova_compute[183177]: 2026-01-26 19:50:39.541 183181 DEBUG nova.network.neutron [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:50:39 compute-0 nova_compute[183177]: 2026-01-26 19:50:39.686 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:39 compute-0 nova_compute[183177]: 2026-01-26 19:50:39.765 183181 DEBUG nova.network.neutron [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:50:40 compute-0 nova_compute[183177]: 2026-01-26 19:50:40.109 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:40 compute-0 nova_compute[183177]: 2026-01-26 19:50:40.273 183181 DEBUG oslo_concurrency.lockutils [req-15feea84-9d04-4dff-9c62-ddae7136901a req-beacf1e0-c314-4739-90d4-5263fba7007d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:50:40 compute-0 nova_compute[183177]: 2026-01-26 19:50:40.274 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquired lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:50:40 compute-0 nova_compute[183177]: 2026-01-26 19:50:40.274 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:50:41 compute-0 podman[209366]: 2026-01-26 19:50:41.353239375 +0000 UTC m=+0.092100391 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Jan 26 19:50:41 compute-0 podman[209367]: 2026-01-26 19:50:41.354081907 +0000 UTC m=+0.088405191 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 19:50:41 compute-0 sshd-session[209364]: Invalid user ubuntu from 112.119.212.162 port 55576
Jan 26 19:50:41 compute-0 nova_compute[183177]: 2026-01-26 19:50:41.615 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:50:42 compute-0 nova_compute[183177]: 2026-01-26 19:50:42.250 183181 WARNING neutronclient.v2_0.client [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:50:42 compute-0 nova_compute[183177]: 2026-01-26 19:50:42.496 183181 DEBUG nova.network.neutron [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Updating instance_info_cache with network_info: [{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:50:42 compute-0 sshd-session[209364]: error: maximum authentication attempts exceeded for invalid user ubuntu from 112.119.212.162 port 55576 ssh2 [preauth]
Jan 26 19:50:42 compute-0 sshd-session[209364]: Disconnecting invalid user ubuntu 112.119.212.162 port 55576: Too many authentication failures [preauth]
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.003 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Releasing lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.003 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance network_info: |[{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.005 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Start _get_guest_xml network_info=[{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.010 183181 WARNING nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.011 183181 DEBUG nova.virt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1788206451', uuid='72f54dad-db6d-4869-ab9d-ff6464876dc5'), owner=OwnerMeta(userid='41beaf3eb21246ea94c2701984cf4279', username='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin', projectid='f91153215558476397ac0fa698028694', projectname='tempest-TestExecuteHostMaintenanceStrategy-651747916'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": 
"77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457043.0118566) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.016 183181 DEBUG nova.virt.libvirt.host [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.016 183181 DEBUG nova.virt.libvirt.host [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.019 183181 DEBUG nova.virt.libvirt.host [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.020 183181 DEBUG nova.virt.libvirt.host [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.021 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.021 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.022 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.022 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.022 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.022 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.022 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.023 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.023 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.023 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.023 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.024 183181 DEBUG nova.virt.hardware [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.027 183181 DEBUG nova.virt.libvirt.vif [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:50:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1788206451',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1788206451',id=14,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-rdud8twg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-Te
stExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:50:36Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=72f54dad-db6d-4869-ab9d-ff6464876dc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.028 183181 DEBUG nova.network.os_vif_util [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converting VIF {"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.028 183181 DEBUG nova.network.os_vif_util [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.029 183181 DEBUG nova.objects.instance [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72f54dad-db6d-4869-ab9d-ff6464876dc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.537 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <uuid>72f54dad-db6d-4869-ab9d-ff6464876dc5</uuid>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <name>instance-0000000e</name>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1788206451</nova:name>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:50:43</nova:creationTime>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:50:43 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:50:43 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         <nova:port uuid="77fc680a-e843-40d6-8230-7edc80f67312">
Jan 26 19:50:43 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <system>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="serial">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="uuid">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </system>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <os>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </os>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <features>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </features>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:a2:7c:80"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <target dev="tap77fc680a-e8"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <video>
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </video>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:50:43 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:50:43 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:50:43 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:50:43 compute-0 nova_compute[183177]: </domain>
Jan 26 19:50:43 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.538 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Preparing to wait for external event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.539 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.540 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.540 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.541 183181 DEBUG nova.virt.libvirt.vif [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:50:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1788206451',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1788206451',id=14,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-rdud8twg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:50:36Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=72f54dad-db6d-4869-ab9d-ff6464876dc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.542 183181 DEBUG nova.network.os_vif_util [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converting VIF {"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.543 183181 DEBUG nova.network.os_vif_util [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.544 183181 DEBUG os_vif [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.545 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.545 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.546 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.547 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.548 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2420686e-ee67-576e-afa0-246beb318c09', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.580 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.582 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.586 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.586 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77fc680a-e8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.587 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap77fc680a-e8, col_values=(('qos', UUID('37d49726-a440-463f-b10c-f0ac0fab80ec')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.587 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap77fc680a-e8, col_values=(('external_ids', {'iface-id': '77fc680a-e843-40d6-8230-7edc80f67312', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:7c:80', 'vm-uuid': '72f54dad-db6d-4869-ab9d-ff6464876dc5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.589 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 NetworkManager[55489]: <info>  [1769457043.5913] manager: (tap77fc680a-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.592 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.599 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:43 compute-0 nova_compute[183177]: 2026-01-26 19:50:43.601 183181 INFO os_vif [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8')
Jan 26 19:50:44 compute-0 sshd-session[209407]: Invalid user ubuntu from 112.119.212.162 port 56104
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.111 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.151 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.152 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.152 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] No VIF found with MAC fa:16:3e:a2:7c:80, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.153 183181 INFO nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Using config drive
Jan 26 19:50:45 compute-0 sshd-session[209407]: Received disconnect from 112.119.212.162 port 56104:11: disconnected by user [preauth]
Jan 26 19:50:45 compute-0 sshd-session[209407]: Disconnected from invalid user ubuntu 112.119.212.162 port 56104 [preauth]
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.670 183181 WARNING neutronclient.v2_0.client [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.912 183181 INFO nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Creating config drive at /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config
Jan 26 19:50:45 compute-0 nova_compute[183177]: 2026-01-26 19:50:45.918 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdeut8ifz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.057 183181 DEBUG oslo_concurrency.processutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdeut8ifz" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:50:46 compute-0 kernel: tap77fc680a-e8: entered promiscuous mode
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.1459] manager: (tap77fc680a-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 ovn_controller[95396]: 2026-01-26T19:50:46Z|00117|binding|INFO|Claiming lport 77fc680a-e843-40d6-8230-7edc80f67312 for this chassis.
Jan 26 19:50:46 compute-0 ovn_controller[95396]: 2026-01-26T19:50:46Z|00118|binding|INFO|77fc680a-e843-40d6-8230-7edc80f67312: Claiming fa:16:3e:a2:7c:80 10.100.0.14
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.172 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:7c:80 10.100.0.14'], port_security=['fa:16:3e:a2:7c:80 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '72f54dad-db6d-4869-ab9d-ff6464876dc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcc3a434-5bc5-4cb2-8878-ad4556ff41ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd82498-4e86-414e-9a6d-c217ab314723, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=77fc680a-e843-40d6-8230-7edc80f67312) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:50:46 compute-0 ovn_controller[95396]: 2026-01-26T19:50:46Z|00119|binding|INFO|Setting lport 77fc680a-e843-40d6-8230-7edc80f67312 ovn-installed in OVS
Jan 26 19:50:46 compute-0 ovn_controller[95396]: 2026-01-26T19:50:46Z|00120|binding|INFO|Setting lport 77fc680a-e843-40d6-8230-7edc80f67312 up in Southbound
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.173 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.174 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 77fc680a-e843-40d6-8230-7edc80f67312 in datapath 147aa3ea-66ec-4250-9408-de2c9a19f4fa bound to our chassis
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.176 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.175 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 systemd-udevd[209428]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.194 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[50924f95-2343-4fb0-ace6-ef6fa00aaaef]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.195 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap147aa3ea-61 in ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.197 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap147aa3ea-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.197 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bc006ec6-5d5f-47bb-b24b-98c137acc27e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.198 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[85eb15d8-2f07-4941-ba9d-97780adc6601]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 systemd-machined[154465]: New machine qemu-10-instance-0000000e.
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.2068] device (tap77fc680a-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.2080] device (tap77fc680a-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.212 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[451c2e50-3a20-474e-8c88-ee14aa02d4e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000e.
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.232 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9e78ec70-c3fa-427c-a22e-dd8bcf8d6d23]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.263 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[3150b9d7-a821-4671-aec3-3299d66b75c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.266 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[056e7532-3496-4f2a-ba0d-da5d048f823e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.2688] manager: (tap147aa3ea-60): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.300 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c9567835-8625-49d8-8309-9ca86a7bf640]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.303 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fa5fae-b978-483c-aaae-5c3cb105d092]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.3343] device (tap147aa3ea-60): carrier: link connected
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.343 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd3e3d8-3646-4098-b0c4-8769a2880897]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.373 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f36cdc-8b66-4d08-8af4-e61f308f87b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap147aa3ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:f3:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450522, 'reachable_time': 38956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209464, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.396 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[56ad404a-66ac-41d3-bc04-ab534ccd6157]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:f31e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450522, 'tstamp': 450522}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209465, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.422 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[69214849-a73b-406a-af77-f2b8640c600b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap147aa3ea-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:f3:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450522, 'reachable_time': 38956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209466, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.474 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[83b39900-c299-4689-b28d-8c983bc636d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.577 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8a73584e-0153-48db-84d6-5e12fc471b16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.578 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap147aa3ea-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.579 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.580 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap147aa3ea-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.582 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 kernel: tap147aa3ea-60: entered promiscuous mode
Jan 26 19:50:46 compute-0 NetworkManager[55489]: <info>  [1769457046.5841] manager: (tap147aa3ea-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.584 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.586 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap147aa3ea-60, col_values=(('external_ids', {'iface-id': 'b6204483-01c7-496e-85f4-be5264700777'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:50:46 compute-0 ovn_controller[95396]: 2026-01-26T19:50:46Z|00121|binding|INFO|Releasing lport b6204483-01c7-496e-85f4-be5264700777 from this chassis (sb_readonly=0)
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.587 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 nova_compute[183177]: 2026-01-26 19:50:46.610 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.612 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7b35049d-e621-4ac2-8171-aee9bb3f9fcd]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.613 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.614 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.614 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 147aa3ea-66ec-4250-9408-de2c9a19f4fa disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.614 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.615 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d598d74e-541e-43ef-9048-edf78b15f0cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.615 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.616 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4c02bb-5f55-4765-918c-4d0523552b0b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.617 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 147aa3ea-66ec-4250-9408-de2c9a19f4fa
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:50:46 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:46.617 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'env', 'PROCESS_TAG=haproxy-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/147aa3ea-66ec-4250-9408-de2c9a19f4fa.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:50:47 compute-0 podman[209504]: 2026-01-26 19:50:47.057359221 +0000 UTC m=+0.042121505 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:50:47 compute-0 podman[209504]: 2026-01-26 19:50:47.209506977 +0000 UTC m=+0.194269221 container create c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:50:47 compute-0 systemd[1]: Started libpod-conmon-c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5.scope.
Jan 26 19:50:47 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.286 183181 DEBUG nova.compute.manager [req-d42948a2-64f2-4ee3-bbe3-09e764b7fc48 req-63f00b12-8360-4c04-8e2c-b85a0fdb7c0a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.288 183181 DEBUG oslo_concurrency.lockutils [req-d42948a2-64f2-4ee3-bbe3-09e764b7fc48 req-63f00b12-8360-4c04-8e2c-b85a0fdb7c0a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.288 183181 DEBUG oslo_concurrency.lockutils [req-d42948a2-64f2-4ee3-bbe3-09e764b7fc48 req-63f00b12-8360-4c04-8e2c-b85a0fdb7c0a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.289 183181 DEBUG oslo_concurrency.lockutils [req-d42948a2-64f2-4ee3-bbe3-09e764b7fc48 req-63f00b12-8360-4c04-8e2c-b85a0fdb7c0a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.289 183181 DEBUG nova.compute.manager [req-d42948a2-64f2-4ee3-bbe3-09e764b7fc48 req-63f00b12-8360-4c04-8e2c-b85a0fdb7c0a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Processing event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.291 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:50:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59725f49e03bf621a9150f3241b72ca5e4e17429a95aa71bea4fd37aacdaae53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.306 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.314 183181 INFO nova.virt.libvirt.driver [-] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance spawned successfully.
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.315 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:50:47 compute-0 podman[209504]: 2026-01-26 19:50:47.320801794 +0000 UTC m=+0.305564078 container init c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 19:50:47 compute-0 podman[209504]: 2026-01-26 19:50:47.325774208 +0000 UTC m=+0.310536452 container start c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:50:47 compute-0 podman[209518]: 2026-01-26 19:50:47.3466802 +0000 UTC m=+0.081189057 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:50:47 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [NOTICE]   (209541) : New worker (209550) forked
Jan 26 19:50:47 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [NOTICE]   (209541) : Loading success.
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.841 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.842 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.843 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.843 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.844 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:47 compute-0 nova_compute[183177]: 2026-01-26 19:50:47.844 183181 DEBUG nova.virt.libvirt.driver [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:50:48 compute-0 sshd-session[209411]: Invalid user pi from 112.119.212.162 port 56478
Jan 26 19:50:48 compute-0 nova_compute[183177]: 2026-01-26 19:50:48.357 183181 INFO nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Took 10.71 seconds to spawn the instance on the hypervisor.
Jan 26 19:50:48 compute-0 nova_compute[183177]: 2026-01-26 19:50:48.358 183181 DEBUG nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:50:48 compute-0 nova_compute[183177]: 2026-01-26 19:50:48.590 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:48 compute-0 nova_compute[183177]: 2026-01-26 19:50:48.900 183181 INFO nova.compute.manager [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Took 15.95 seconds to build instance.
Jan 26 19:50:49 compute-0 sshd-session[209411]: Received disconnect from 112.119.212.162 port 56478:11: disconnected by user [preauth]
Jan 26 19:50:49 compute-0 sshd-session[209411]: Disconnected from invalid user pi 112.119.212.162 port 56478 [preauth]
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.361 183181 DEBUG nova.compute.manager [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.362 183181 DEBUG oslo_concurrency.lockutils [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.362 183181 DEBUG oslo_concurrency.lockutils [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.363 183181 DEBUG oslo_concurrency.lockutils [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.363 183181 DEBUG nova.compute.manager [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.363 183181 WARNING nova.compute.manager [req-011c64c2-c589-4e00-a579-5e2e2233abd3 req-d83bd81d-2140-4b14-80e1-357b0ddbfab0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received unexpected event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with vm_state active and task_state None.
Jan 26 19:50:49 compute-0 nova_compute[183177]: 2026-01-26 19:50:49.411 183181 DEBUG oslo_concurrency.lockutils [None req-5206334c-8801-48b5-9ee4-9f3a0b0a5520 41beaf3eb21246ea94c2701984cf4279 f91153215558476397ac0fa698028694 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.475s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:50 compute-0 nova_compute[183177]: 2026-01-26 19:50:50.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:51 compute-0 nova_compute[183177]: 2026-01-26 19:50:51.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:51 compute-0 sshd-session[209560]: Invalid user baikal from 112.119.212.162 port 56992
Jan 26 19:50:51 compute-0 sshd-session[209560]: Received disconnect from 112.119.212.162 port 56992:11: disconnected by user [preauth]
Jan 26 19:50:51 compute-0 sshd-session[209560]: Disconnected from invalid user baikal 112.119.212.162 port 56992 [preauth]
Jan 26 19:50:53 compute-0 nova_compute[183177]: 2026-01-26 19:50:53.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:53 compute-0 nova_compute[183177]: 2026-01-26 19:50:53.621 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:55 compute-0 nova_compute[183177]: 2026-01-26 19:50:55.118 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:56 compute-0 nova_compute[183177]: 2026-01-26 19:50:56.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:56 compute-0 nova_compute[183177]: 2026-01-26 19:50:56.155 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:50:58 compute-0 nova_compute[183177]: 2026-01-26 19:50:58.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:58 compute-0 nova_compute[183177]: 2026-01-26 19:50:58.664 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:50:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:59.649 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:50:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:50:59.650 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.659 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:50:59 compute-0 nova_compute[183177]: 2026-01-26 19:50:59.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:50:59 compute-0 podman[192499]: time="2026-01-26T19:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:50:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:50:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Jan 26 19:51:00 compute-0 ovn_controller[95396]: 2026-01-26T19:51:00Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:7c:80 10.100.0.14
Jan 26 19:51:00 compute-0 ovn_controller[95396]: 2026-01-26T19:51:00Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:7c:80 10.100.0.14
Jan 26 19:51:00 compute-0 nova_compute[183177]: 2026-01-26 19:51:00.120 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:00 compute-0 nova_compute[183177]: 2026-01-26 19:51:00.725 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:00 compute-0 nova_compute[183177]: 2026-01-26 19:51:00.811 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:00 compute-0 nova_compute[183177]: 2026-01-26 19:51:00.812 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:00 compute-0 nova_compute[183177]: 2026-01-26 19:51:00.874 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.071 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.072 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.091 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.091 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=73.07087326049805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.092 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:01 compute-0 nova_compute[183177]: 2026-01-26 19:51:01.092 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:01 compute-0 openstack_network_exporter[195363]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:51:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:51:01 compute-0 openstack_network_exporter[195363]: ERROR   19:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:51:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:51:02 compute-0 nova_compute[183177]: 2026-01-26 19:51:02.149 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 72f54dad-db6d-4869-ab9d-ff6464876dc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:51:02 compute-0 nova_compute[183177]: 2026-01-26 19:51:02.151 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:51:02 compute-0 nova_compute[183177]: 2026-01-26 19:51:02.151 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:51:01 up  1:15,  0 user,  load average: 0.34, 0.22, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_f91153215558476397ac0fa698028694': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:51:02 compute-0 nova_compute[183177]: 2026-01-26 19:51:02.198 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:51:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:02.653 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:51:02 compute-0 nova_compute[183177]: 2026-01-26 19:51:02.706 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:51:03 compute-0 nova_compute[183177]: 2026-01-26 19:51:03.218 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:51:03 compute-0 nova_compute[183177]: 2026-01-26 19:51:03.219 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:03 compute-0 nova_compute[183177]: 2026-01-26 19:51:03.666 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:05 compute-0 nova_compute[183177]: 2026-01-26 19:51:05.124 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:06 compute-0 nova_compute[183177]: 2026-01-26 19:51:06.219 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:06 compute-0 nova_compute[183177]: 2026-01-26 19:51:06.220 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:08 compute-0 nova_compute[183177]: 2026-01-26 19:51:08.709 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:09 compute-0 podman[209591]: 2026-01-26 19:51:09.40364872 +0000 UTC m=+0.140370440 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 19:51:10 compute-0 nova_compute[183177]: 2026-01-26 19:51:10.127 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:12 compute-0 podman[209617]: 2026-01-26 19:51:12.351871347 +0000 UTC m=+0.089889221 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 19:51:12 compute-0 podman[209618]: 2026-01-26 19:51:12.38204628 +0000 UTC m=+0.114749131 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:51:13 compute-0 nova_compute[183177]: 2026-01-26 19:51:13.713 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:15 compute-0 nova_compute[183177]: 2026-01-26 19:51:15.129 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:18 compute-0 podman[209654]: 2026-01-26 19:51:18.341812909 +0000 UTC m=+0.083294534 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:51:18 compute-0 nova_compute[183177]: 2026-01-26 19:51:18.716 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:20 compute-0 nova_compute[183177]: 2026-01-26 19:51:20.132 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:22 compute-0 nova_compute[183177]: 2026-01-26 19:51:22.706 183181 DEBUG nova.compute.manager [None req-4fe2c61f-7f31-417b-b439-1c22eff2063d 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 19:51:22 compute-0 nova_compute[183177]: 2026-01-26 19:51:22.778 183181 DEBUG nova.compute.provider_tree [None req-4fe2c61f-7f31-417b-b439-1c22eff2063d 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 22 to 24 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:51:23 compute-0 nova_compute[183177]: 2026-01-26 19:51:23.719 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:24.060 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:24.061 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:24.061 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:25 compute-0 nova_compute[183177]: 2026-01-26 19:51:25.136 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:28 compute-0 nova_compute[183177]: 2026-01-26 19:51:28.726 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:29 compute-0 podman[192499]: time="2026-01-26T19:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:51:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:51:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Jan 26 19:51:30 compute-0 nova_compute[183177]: 2026-01-26 19:51:30.065 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Check if temp file /var/lib/nova/instances/tmppe6qtyq7 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 19:51:30 compute-0 nova_compute[183177]: 2026-01-26 19:51:30.074 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe6qtyq7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='72f54dad-db6d-4869-ab9d-ff6464876dc5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 19:51:30 compute-0 nova_compute[183177]: 2026-01-26 19:51:30.137 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:31 compute-0 openstack_network_exporter[195363]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:51:31 compute-0 openstack_network_exporter[195363]: ERROR   19:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:51:33 compute-0 nova_compute[183177]: 2026-01-26 19:51:33.729 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.645 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.727 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.729 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.791 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.793 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Preparing to wait for external event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.794 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.794 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:34 compute-0 nova_compute[183177]: 2026-01-26 19:51:34.795 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:35 compute-0 nova_compute[183177]: 2026-01-26 19:51:35.139 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:38 compute-0 nova_compute[183177]: 2026-01-26 19:51:38.732 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:40 compute-0 nova_compute[183177]: 2026-01-26 19:51:40.141 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:40 compute-0 podman[209688]: 2026-01-26 19:51:40.38002944 +0000 UTC m=+0.122824148 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.878 183181 DEBUG nova.compute.manager [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.879 183181 DEBUG oslo_concurrency.lockutils [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.879 183181 DEBUG oslo_concurrency.lockutils [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.880 183181 DEBUG oslo_concurrency.lockutils [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.880 183181 DEBUG nova.compute.manager [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No event matching network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 in dict_keys([('network-vif-plugged', '77fc680a-e843-40d6-8230-7edc80f67312')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 19:51:41 compute-0 nova_compute[183177]: 2026-01-26 19:51:41.881 183181 DEBUG nova.compute.manager [req-6c620b1a-ac00-49ae-831d-ce4fbc5c1ebf req-7635cc8a-3e47-496b-a9d8-5df25aa18c34 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:51:41 compute-0 ovn_controller[95396]: 2026-01-26T19:51:41Z|00122|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 19:51:42 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:42.306 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:51:42 compute-0 nova_compute[183177]: 2026-01-26 19:51:42.307 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:42 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:42.307 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:51:43 compute-0 podman[209715]: 2026-01-26 19:51:43.329524332 +0000 UTC m=+0.068667420 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Jan 26 19:51:43 compute-0 podman[209716]: 2026-01-26 19:51:43.332913633 +0000 UTC m=+0.064176878 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.736 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.823 183181 INFO nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Took 9.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.990 183181 DEBUG nova.compute.manager [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.990 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.991 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.991 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.991 183181 DEBUG nova.compute.manager [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Processing event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.991 183181 DEBUG nova.compute.manager [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-changed-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.991 183181 DEBUG nova.compute.manager [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Refreshing instance network info cache due to event network-changed-77fc680a-e843-40d6-8230-7edc80f67312. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.992 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.992 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.992 183181 DEBUG nova.network.neutron [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Refreshing network info cache for port 77fc680a-e843-40d6-8230-7edc80f67312 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:51:43 compute-0 nova_compute[183177]: 2026-01-26 19:51:43.993 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:51:44 compute-0 nova_compute[183177]: 2026-01-26 19:51:44.500 183181 WARNING neutronclient.v2_0.client [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:51:44 compute-0 nova_compute[183177]: 2026-01-26 19:51:44.510 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe6qtyq7',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='72f54dad-db6d-4869-ab9d-ff6464876dc5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(994d99b0-481a-432f-9192-8758145e3308),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.029 183181 DEBUG nova.objects.instance [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 72f54dad-db6d-4869-ab9d-ff6464876dc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.031 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.034 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.034 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.053 183181 WARNING neutronclient.v2_0.client [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.144 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.312 183181 DEBUG nova.network.neutron [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Updated VIF entry in instance network info cache for port 77fc680a-e843-40d6-8230-7edc80f67312. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.313 183181 DEBUG nova.network.neutron [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Updating instance_info_cache with network_info: [{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.538 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.539 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.548 183181 DEBUG nova.virt.libvirt.vif [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:50:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1788206451',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1788206451',id=14,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:50:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-rdud8twg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:50:48Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=72f54dad-db6d-4869-ab9d-ff6464876dc5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.548 183181 DEBUG nova.network.os_vif_util [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.549 183181 DEBUG nova.network.os_vif_util [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.550 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:a2:7c:80"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <target dev="tap77fc680a-e8"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]: </interface>
Jan 26 19:51:45 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.552 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <name>instance-0000000e</name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <uuid>72f54dad-db6d-4869-ab9d-ff6464876dc5</uuid>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1788206451</nova:name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:50:43</nova:creationTime>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:port uuid="77fc680a-e843-40d6-8230-7edc80f67312">
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="serial">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="uuid">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:a2:7c:80"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap77fc680a-e8"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </target>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </console>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </input>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]: </domain>
Jan 26 19:51:45 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.553 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <name>instance-0000000e</name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <uuid>72f54dad-db6d-4869-ab9d-ff6464876dc5</uuid>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1788206451</nova:name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:50:43</nova:creationTime>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:port uuid="77fc680a-e843-40d6-8230-7edc80f67312">
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="serial">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="uuid">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:a2:7c:80"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="tap77fc680a-e8"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </target>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </console>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </input>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]: </domain>
Jan 26 19:51:45 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.554 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <name>instance-0000000e</name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <uuid>72f54dad-db6d-4869-ab9d-ff6464876dc5</uuid>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1788206451</nova:name>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:50:43</nova:creationTime>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:user uuid="41beaf3eb21246ea94c2701984cf4279">tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin</nova:user>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:project uuid="f91153215558476397ac0fa698028694">tempest-TestExecuteHostMaintenanceStrategy-651747916</nova:project>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <nova:port uuid="77fc680a-e843-40d6-8230-7edc80f67312">
Jan 26 19:51:45 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="serial">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="uuid">72f54dad-db6d-4869-ab9d-ff6464876dc5</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </system>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </os>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </features>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/disk.config"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:a2:7c:80"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target dev="tap77fc680a-e8"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:51:45 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       </target>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5/console.log" append="off"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </console>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </input>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </video>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:51:45 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:51:45 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:51:45 compute-0 nova_compute[183177]: </domain>
Jan 26 19:51:45 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.555 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 19:51:45 compute-0 nova_compute[183177]: 2026-01-26 19:51:45.822 183181 DEBUG oslo_concurrency.lockutils [req-beb6dc17-0d4f-4d09-a295-577233d3f1e7 req-2a0ff96c-ff5f-4693-8ab2-0dcee521e09b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-72f54dad-db6d-4869-ab9d-ff6464876dc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:51:46 compute-0 nova_compute[183177]: 2026-01-26 19:51:46.041 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:51:46 compute-0 nova_compute[183177]: 2026-01-26 19:51:46.042 183181 INFO nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.061 183181 INFO nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.565 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.566 183181 DEBUG nova.virt.libvirt.migration [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 19:51:47 compute-0 kernel: tap77fc680a-e8 (unregistering): left promiscuous mode
Jan 26 19:51:47 compute-0 NetworkManager[55489]: <info>  [1769457107.9040] device (tap77fc680a-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:51:47 compute-0 ovn_controller[95396]: 2026-01-26T19:51:47Z|00123|binding|INFO|Releasing lport 77fc680a-e843-40d6-8230-7edc80f67312 from this chassis (sb_readonly=0)
Jan 26 19:51:47 compute-0 ovn_controller[95396]: 2026-01-26T19:51:47Z|00124|binding|INFO|Setting lport 77fc680a-e843-40d6-8230-7edc80f67312 down in Southbound
Jan 26 19:51:47 compute-0 ovn_controller[95396]: 2026-01-26T19:51:47Z|00125|binding|INFO|Removing iface tap77fc680a-e8 ovn-installed in OVS
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.918 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.921 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:47 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:47.927 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:7c:80 10.100.0.14'], port_security=['fa:16:3e:a2:7c:80 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '72f54dad-db6d-4869-ab9d-ff6464876dc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f91153215558476397ac0fa698028694', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dcc3a434-5bc5-4cb2-8878-ad4556ff41ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd82498-4e86-414e-9a6d-c217ab314723, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=77fc680a-e843-40d6-8230-7edc80f67312) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:51:47 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:47.928 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 77fc680a-e843-40d6-8230-7edc80f67312 in datapath 147aa3ea-66ec-4250-9408-de2c9a19f4fa unbound from our chassis
Jan 26 19:51:47 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:47.929 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 147aa3ea-66ec-4250-9408-de2c9a19f4fa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:51:47 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:47.932 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c66db4cc-e779-4d22-ab5b-9f688103af81]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:47 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:47.934 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa namespace which is not needed anymore
Jan 26 19:51:47 compute-0 nova_compute[183177]: 2026-01-26 19:51:47.959 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 26 19:51:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000e.scope: Consumed 16.105s CPU time.
Jan 26 19:51:47 compute-0 systemd-machined[154465]: Machine qemu-10-instance-0000000e terminated.
Jan 26 19:51:48 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [NOTICE]   (209541) : haproxy version is 3.0.5-8e879a5
Jan 26 19:51:48 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [NOTICE]   (209541) : path to executable is /usr/sbin/haproxy
Jan 26 19:51:48 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [WARNING]  (209541) : Exiting Master process...
Jan 26 19:51:48 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [ALERT]    (209541) : Current worker (209550) exited with code 143 (Terminated)
Jan 26 19:51:48 compute-0 neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa[209521]: [WARNING]  (209541) : All workers exited. Exiting... (0)
Jan 26 19:51:48 compute-0 podman[209795]: 2026-01-26 19:51:48.12336646 +0000 UTC m=+0.044643663 container kill c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 19:51:48 compute-0 systemd[1]: libpod-c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5.scope: Deactivated successfully.
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.150 183181 DEBUG nova.virt.libvirt.guest [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.151 183181 INFO nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration operation has completed
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.151 183181 INFO nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] _post_live_migration() is started..
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.154 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.154 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.154 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.166 183181 WARNING neutronclient.v2_0.client [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.167 183181 WARNING neutronclient.v2_0.client [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:51:48 compute-0 podman[209829]: 2026-01-26 19:51:48.185258827 +0000 UTC m=+0.030568465 container died c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5-userdata-shm.mount: Deactivated successfully.
Jan 26 19:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-59725f49e03bf621a9150f3241b72ca5e4e17429a95aa71bea4fd37aacdaae53-merged.mount: Deactivated successfully.
Jan 26 19:51:48 compute-0 podman[209829]: 2026-01-26 19:51:48.241526611 +0000 UTC m=+0.086836259 container remove c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 19:51:48 compute-0 systemd[1]: libpod-conmon-c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5.scope: Deactivated successfully.
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.250 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc2919c-a952-4a50-9b2f-2b9caa4a239d]: (4, ("Mon Jan 26 07:51:48 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa (c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5)\nc629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5\nMon Jan 26 07:51:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa (c629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5)\nc629435ba118b34641376cf79273eada4158dde98d43a5d15a8027b685eadfa5\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.253 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[34feef4b-b31c-4ec4-9f01-3a73908bcbc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.253 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/147aa3ea-66ec-4250-9408-de2c9a19f4fa.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.254 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[765ed39a-dc49-40af-baa3-edc28228564d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.255 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap147aa3ea-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.291 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 kernel: tap147aa3ea-60: left promiscuous mode
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.309 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.320 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.321 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.323 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[67381afa-32f3-45e5-8403-f5496655fbc2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.337 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a77d9a5d-d7d9-4ec0-8853-fc3eda2dd518]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.337 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e5e6b7-d81c-434a-8840-bc583b0e2da5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.363 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c21e10-a245-459d-834f-48dcc75e5efa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450514, 'reachable_time': 38091, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209862, 'error': None, 'target': 'ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.370 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-147aa3ea-66ec-4250-9408-de2c9a19f4fa deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:51:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:51:48.370 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3a3f2f-8901-45a9-8a77-cdcb58db3f48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:51:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d147aa3ea\x2d66ec\x2d4250\x2d9408\x2dde2c9a19f4fa.mount: Deactivated successfully.
Jan 26 19:51:48 compute-0 podman[209863]: 2026-01-26 19:51:48.474947446 +0000 UTC m=+0.071932467 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.661 183181 DEBUG nova.compute.manager [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.661 183181 DEBUG oslo_concurrency.lockutils [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.662 183181 DEBUG oslo_concurrency.lockutils [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.662 183181 DEBUG oslo_concurrency.lockutils [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.662 183181 DEBUG nova.compute.manager [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.663 183181 DEBUG nova.compute.manager [req-4cf80107-e54c-439a-9710-08d2472ceb51 req-f7699340-ce75-440a-800d-c0e58b6a7292 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.739 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.954 183181 DEBUG nova.network.neutron [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 77fc680a-e843-40d6-8230-7edc80f67312 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.955 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.956 183181 DEBUG nova.virt.libvirt.vif [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:50:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1788206451',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1788206451',id=14,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:50:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f91153215558476397ac0fa698028694',ramdisk_id='',reservation_id='r-rdud8twg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-651747916',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-651747916-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:51:24Z,user_data=None,user_id='41beaf3eb21246ea94c2701984cf4279',uuid=72f54dad-db6d-4869-ab9d-ff6464876dc5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.957 183181 DEBUG nova.network.os_vif_util [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "77fc680a-e843-40d6-8230-7edc80f67312", "address": "fa:16:3e:a2:7c:80", "network": {"id": "147aa3ea-66ec-4250-9408-de2c9a19f4fa", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-971803870-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462f3c2b8ab64156915d1fc496fd2e53", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77fc680a-e8", "ovs_interfaceid": "77fc680a-e843-40d6-8230-7edc80f67312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.958 183181 DEBUG nova.network.os_vif_util [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.959 183181 DEBUG os_vif [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.962 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.963 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77fc680a-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.965 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.967 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.968 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.970 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.970 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=37d49726-a440-463f-b10c-f0ac0fab80ec) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.971 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.973 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.976 183181 INFO os_vif [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:7c:80,bridge_name='br-int',has_traffic_filtering=True,id=77fc680a-e843-40d6-8230-7edc80f67312,network=Network(147aa3ea-66ec-4250-9408-de2c9a19f4fa),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77fc680a-e8')
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.977 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.977 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.978 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.978 183181 DEBUG nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.979 183181 INFO nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Deleting instance files /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5_del
Jan 26 19:51:48 compute-0 nova_compute[183177]: 2026-01-26 19:51:48.980 183181 INFO nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Deletion of /var/lib/nova/instances/72f54dad-db6d-4869-ab9d-ff6464876dc5_del complete
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.762 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.762 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.763 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.763 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.763 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.763 183181 WARNING nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received unexpected event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with vm_state active and task_state migrating.
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.764 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.764 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.764 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.765 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.765 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.765 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.765 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.766 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.766 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.766 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.767 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.767 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-unplugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.767 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.767 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.768 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.768 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.768 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.768 183181 WARNING nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received unexpected event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with vm_state active and task_state migrating.
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.769 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.769 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.769 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.769 183181 DEBUG oslo_concurrency.lockutils [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.770 183181 DEBUG nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] No waiting events found dispatching network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:51:50 compute-0 nova_compute[183177]: 2026-01-26 19:51:50.770 183181 WARNING nova.compute.manager [req-089d1c7a-0084-4ed0-ac0c-1514d2f352d6 req-024343ea-fe0c-4549-9879-cc56b1f4e60b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Received unexpected event network-vif-plugged-77fc680a-e843-40d6-8230-7edc80f67312 for instance with vm_state active and task_state migrating.
Jan 26 19:51:51 compute-0 nova_compute[183177]: 2026-01-26 19:51:51.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:53 compute-0 nova_compute[183177]: 2026-01-26 19:51:53.972 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:54 compute-0 nova_compute[183177]: 2026-01-26 19:51:54.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:55 compute-0 nova_compute[183177]: 2026-01-26 19:51:55.148 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.023 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.024 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.024 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "72f54dad-db6d-4869-ab9d-ff6464876dc5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.152 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.542 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.543 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.544 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.544 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.778 183181 WARNING nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.781 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.805 183181 DEBUG oslo_concurrency.processutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.807 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5788MB free_disk=73.09850692749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.807 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.808 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:51:58 compute-0 nova_compute[183177]: 2026-01-26 19:51:58.974 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:51:59 compute-0 nova_compute[183177]: 2026-01-26 19:51:59.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:59 compute-0 nova_compute[183177]: 2026-01-26 19:51:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:59 compute-0 nova_compute[183177]: 2026-01-26 19:51:59.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:51:59 compute-0 nova_compute[183177]: 2026-01-26 19:51:59.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:51:59 compute-0 podman[192499]: time="2026-01-26T19:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:51:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:51:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Jan 26 19:51:59 compute-0 nova_compute[183177]: 2026-01-26 19:51:59.832 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 72f54dad-db6d-4869-ab9d-ff6464876dc5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.150 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.341 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.391 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 994d99b0-481a-432f-9192-8758145e3308 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.392 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.392 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:51:58 up  1:16,  0 user,  load average: 0.19, 0.20, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.416 183181 DEBUG nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.448 183181 DEBUG nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.448 183181 DEBUG nova.compute.provider_tree [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.465 183181 DEBUG nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.487 183181 DEBUG nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STATUS_DISABLED,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:52:00 compute-0 nova_compute[183177]: 2026-01-26 19:52:00.528 183181 DEBUG nova.compute.provider_tree [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.036 183181 DEBUG nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:52:01 compute-0 openstack_network_exporter[195363]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:52:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:52:01 compute-0 openstack_network_exporter[195363]: ERROR   19:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:52:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.549 183181 DEBUG nova.compute.resource_tracker [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.550 183181 DEBUG oslo_concurrency.lockutils [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.742s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.555 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.889s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.555 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.555 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.576 183181 INFO nova.compute.manager [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.768 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.769 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.797 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.798 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5789MB free_disk=73.09850692749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.798 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:01 compute-0 nova_compute[183177]: 2026-01-26 19:52:01.799 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:02 compute-0 nova_compute[183177]: 2026-01-26 19:52:02.656 183181 INFO nova.scheduler.client.report [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 994d99b0-481a-432f-9192-8758145e3308
Jan 26 19:52:02 compute-0 nova_compute[183177]: 2026-01-26 19:52:02.656 183181 DEBUG nova.virt.libvirt.driver [None req-bbe9b882-db6e-42a8-ad68-4ccc492d00c0 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 72f54dad-db6d-4869-ab9d-ff6464876dc5] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 19:52:02 compute-0 nova_compute[183177]: 2026-01-26 19:52:02.838 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:52:02 compute-0 nova_compute[183177]: 2026-01-26 19:52:02.839 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:52:01 up  1:16,  0 user,  load average: 0.19, 0.20, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:52:02 compute-0 nova_compute[183177]: 2026-01-26 19:52:02.864 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:52:03 compute-0 nova_compute[183177]: 2026-01-26 19:52:03.372 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:52:03 compute-0 nova_compute[183177]: 2026-01-26 19:52:03.888 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:52:03 compute-0 nova_compute[183177]: 2026-01-26 19:52:03.889 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:03 compute-0 nova_compute[183177]: 2026-01-26 19:52:03.977 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:05 compute-0 nova_compute[183177]: 2026-01-26 19:52:05.154 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:08 compute-0 nova_compute[183177]: 2026-01-26 19:52:08.890 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:08 compute-0 nova_compute[183177]: 2026-01-26 19:52:08.980 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:09 compute-0 nova_compute[183177]: 2026-01-26 19:52:09.401 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:09 compute-0 nova_compute[183177]: 2026-01-26 19:52:09.402 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:10 compute-0 nova_compute[183177]: 2026-01-26 19:52:10.193 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:10 compute-0 sshd-session[209891]: Invalid user hduser from 193.32.162.151 port 49248
Jan 26 19:52:10 compute-0 sshd-session[209891]: Connection closed by invalid user hduser 193.32.162.151 port 49248 [preauth]
Jan 26 19:52:11 compute-0 podman[209893]: 2026-01-26 19:52:11.373440061 +0000 UTC m=+0.119091658 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:52:13 compute-0 nova_compute[183177]: 2026-01-26 19:52:13.982 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:14 compute-0 podman[209920]: 2026-01-26 19:52:14.343418865 +0000 UTC m=+0.071946209 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 26 19:52:14 compute-0 podman[209919]: 2026-01-26 19:52:14.360938176 +0000 UTC m=+0.098355869 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 19:52:15 compute-0 nova_compute[183177]: 2026-01-26 19:52:15.196 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:17 compute-0 nova_compute[183177]: 2026-01-26 19:52:17.739 183181 DEBUG nova.compute.manager [None req-b0f73456-5433-42b5-bf22-1b8749306da9 a22c940165ed49a990de4c1cf7e61838 bbc26b645f2a4d108c00608f11fdebb2 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 19:52:17 compute-0 nova_compute[183177]: 2026-01-26 19:52:17.819 183181 DEBUG nova.compute.provider_tree [None req-b0f73456-5433-42b5-bf22-1b8749306da9 a22c940165ed49a990de4c1cf7e61838 bbc26b645f2a4d108c00608f11fdebb2 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 25 to 27 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:52:18 compute-0 nova_compute[183177]: 2026-01-26 19:52:18.985 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:19 compute-0 podman[209957]: 2026-01-26 19:52:19.33141481 +0000 UTC m=+0.079475542 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:52:20 compute-0 nova_compute[183177]: 2026-01-26 19:52:20.199 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:21 compute-0 nova_compute[183177]: 2026-01-26 19:52:21.211 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:23 compute-0 nova_compute[183177]: 2026-01-26 19:52:23.987 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:24.063 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:24.063 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:24.063 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:25 compute-0 nova_compute[183177]: 2026-01-26 19:52:25.201 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:25 compute-0 sshd-session[209982]: Connection closed by 142.93.140.142 port 48758
Jan 26 19:52:28 compute-0 nova_compute[183177]: 2026-01-26 19:52:28.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:29 compute-0 podman[192499]: time="2026-01-26T19:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:52:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:52:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 26 19:52:30 compute-0 nova_compute[183177]: 2026-01-26 19:52:30.203 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:31 compute-0 openstack_network_exporter[195363]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:52:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:52:31 compute-0 openstack_network_exporter[195363]: ERROR   19:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:52:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:52:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:33.582 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:f5:67 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b15450fb68e4a298e85f1a6a3da0e80', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87196033-f000-4959-a8ea-24ef132800a5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=693f04a5-b9eb-445c-95f9-2771857fca3b) old=Port_Binding(mac=['fa:16:3e:0a:f5:67'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b15450fb68e4a298e85f1a6a3da0e80', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:52:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:33.584 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 693f04a5-b9eb-445c-95f9-2771857fca3b in datapath dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 updated
Jan 26 19:52:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:33.585 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:52:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:33.586 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0010dbf8-8e87-4087-aafc-58fcb695fcac]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:52:33 compute-0 nova_compute[183177]: 2026-01-26 19:52:33.993 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:35 compute-0 nova_compute[183177]: 2026-01-26 19:52:35.205 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:38 compute-0 nova_compute[183177]: 2026-01-26 19:52:38.995 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:40 compute-0 nova_compute[183177]: 2026-01-26 19:52:40.206 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:40.848 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:dd:8c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c858d7de-5adf-42b5-b461-a3d0a045d03d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c858d7de-5adf-42b5-b461-a3d0a045d03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0adf7754-3826-4782-bc0d-ea10e22edab2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e16c6220-e6b5-41f2-97b9-0eeab199d59f) old=Port_Binding(mac=['fa:16:3e:77:dd:8c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c858d7de-5adf-42b5-b461-a3d0a045d03d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c858d7de-5adf-42b5-b461-a3d0a045d03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:52:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:40.849 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e16c6220-e6b5-41f2-97b9-0eeab199d59f in datapath c858d7de-5adf-42b5-b461-a3d0a045d03d updated
Jan 26 19:52:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:40.851 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c858d7de-5adf-42b5-b461-a3d0a045d03d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:52:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:40.852 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5c57f6-fbb4-4a2d-b69d-125fef17fcde]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:52:42 compute-0 podman[209983]: 2026-01-26 19:52:42.373393928 +0000 UTC m=+0.126219239 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:52:43 compute-0 nova_compute[183177]: 2026-01-26 19:52:43.997 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:44.673 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:52:44 compute-0 nova_compute[183177]: 2026-01-26 19:52:44.674 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:44.675 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:52:45 compute-0 nova_compute[183177]: 2026-01-26 19:52:45.208 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:45 compute-0 podman[210009]: 2026-01-26 19:52:45.342286722 +0000 UTC m=+0.082673737 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:52:45 compute-0 podman[210008]: 2026-01-26 19:52:45.355231381 +0000 UTC m=+0.090933910 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 26 19:52:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:52:48.677 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:52:49 compute-0 nova_compute[183177]: 2026-01-26 19:52:49.000 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:50 compute-0 nova_compute[183177]: 2026-01-26 19:52:50.211 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:50 compute-0 podman[210048]: 2026-01-26 19:52:50.342916428 +0000 UTC m=+0.083248813 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:52:51 compute-0 nova_compute[183177]: 2026-01-26 19:52:51.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:51 compute-0 nova_compute[183177]: 2026-01-26 19:52:51.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:52:51 compute-0 nova_compute[183177]: 2026-01-26 19:52:51.734 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:52:52 compute-0 nova_compute[183177]: 2026-01-26 19:52:52.736 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:53 compute-0 nova_compute[183177]: 2026-01-26 19:52:53.058 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:53 compute-0 nova_compute[183177]: 2026-01-26 19:52:53.058 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:53 compute-0 nova_compute[183177]: 2026-01-26 19:52:53.564 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:52:54 compute-0 nova_compute[183177]: 2026-01-26 19:52:54.002 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:54 compute-0 nova_compute[183177]: 2026-01-26 19:52:54.128 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:54 compute-0 nova_compute[183177]: 2026-01-26 19:52:54.129 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:54 compute-0 nova_compute[183177]: 2026-01-26 19:52:54.139 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:52:54 compute-0 nova_compute[183177]: 2026-01-26 19:52:54.140 183181 INFO nova.compute.claims [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:52:54 compute-0 ovn_controller[95396]: 2026-01-26T19:52:54Z|00126|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 26 19:52:55 compute-0 nova_compute[183177]: 2026-01-26 19:52:55.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:55 compute-0 nova_compute[183177]: 2026-01-26 19:52:55.206 183181 DEBUG nova.compute.provider_tree [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:52:55 compute-0 nova_compute[183177]: 2026-01-26 19:52:55.254 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:55 compute-0 nova_compute[183177]: 2026-01-26 19:52:55.716 183181 DEBUG nova.scheduler.client.report [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.235 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.236 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.748 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.748 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.749 183181 WARNING neutronclient.v2_0.client [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:52:56 compute-0 nova_compute[183177]: 2026-01-26 19:52:56.749 183181 WARNING neutronclient.v2_0.client [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:52:57 compute-0 nova_compute[183177]: 2026-01-26 19:52:57.266 183181 INFO nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:52:57 compute-0 nova_compute[183177]: 2026-01-26 19:52:57.780 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:52:57 compute-0 nova_compute[183177]: 2026-01-26 19:52:57.863 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Successfully created port: eca2f979-3d52-4d9a-b618-e06f25abcbce _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.582 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Successfully updated port: eca2f979-3d52-4d9a-b618-e06f25abcbce _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.681 183181 DEBUG nova.compute.manager [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-changed-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.682 183181 DEBUG nova.compute.manager [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Refreshing instance network info cache due to event network-changed-eca2f979-3d52-4d9a-b618-e06f25abcbce. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.683 183181 DEBUG oslo_concurrency.lockutils [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.683 183181 DEBUG oslo_concurrency.lockutils [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.684 183181 DEBUG nova.network.neutron [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Refreshing network info cache for port eca2f979-3d52-4d9a-b618-e06f25abcbce _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.833 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.836 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.837 183181 INFO nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Creating image(s)
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.838 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.838 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.840 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.841 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.848 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.850 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.940 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.942 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.943 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.945 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.951 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:52:58 compute-0 nova_compute[183177]: 2026-01-26 19:52:58.952 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.005 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.041 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.042 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.085 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.087 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.088 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.102 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.175 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.176 183181 DEBUG nova.virt.disk.api [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Checking if we can resize image /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.176 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.194 183181 WARNING neutronclient.v2_0.client [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.247 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.248 183181 DEBUG nova.virt.disk.api [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Cannot resize image /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.249 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.250 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Ensure instance console log exists: /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.251 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.252 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.252 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.289 183181 DEBUG nova.network.neutron [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.648 183181 DEBUG nova.network.neutron [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.694 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.694 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.695 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:52:59 compute-0 nova_compute[183177]: 2026-01-26 19:52:59.695 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:52:59 compute-0 podman[192499]: time="2026-01-26T19:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:52:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:52:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.115 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.116 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.158 183181 DEBUG oslo_concurrency.lockutils [req-60e83469-1f87-4cc2-972e-6b67cf8769e8 req-667a92b0-72c5-4773-893e-a78f8218cf1b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.159 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquired lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.159 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.161 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.162 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5786MB free_disk=73.09829711914062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.162 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.163 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.297 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:00 compute-0 nova_compute[183177]: 2026-01-26 19:53:00.798 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.034 183181 WARNING neutronclient.v2_0.client [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.222 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 8c94ca67-fe95-4c15-a39f-d6abc83292e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.222 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.222 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:53:00 up  1:17,  0 user,  load average: 0.07, 0.16, 0.29\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_00d6147467834874bb42a420f895fa88': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.273 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.277 183181 DEBUG nova.network.neutron [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updating instance_info_cache with network_info: [{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:53:01 compute-0 openstack_network_exporter[195363]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:53:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:53:01 compute-0 openstack_network_exporter[195363]: ERROR   19:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:53:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.785 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Releasing lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.786 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance network_info: |[{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.788 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.802 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Start _get_guest_xml network_info=[{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.808 183181 WARNING nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.810 183181 DEBUG nova.virt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324', uuid='8c94ca67-fe95-4c15-a39f-d6abc83292e6'), owner=OwnerMeta(userid='0415606853f441d3b598e0af51d1b700', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin', projectid='00d6147467834874bb42a420f895fa88', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457181.8103526) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.814 183181 DEBUG nova.virt.libvirt.host [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.815 183181 DEBUG nova.virt.libvirt.host [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.819 183181 DEBUG nova.virt.libvirt.host [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.820 183181 DEBUG nova.virt.libvirt.host [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.822 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.822 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.823 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.824 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.824 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.825 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.825 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.826 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.826 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.827 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.827 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.828 183181 DEBUG nova.virt.hardware [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.834 183181 DEBUG nova.virt.libvirt.vif [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-122',id=16,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-ka0nniqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:52:57Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=8c94ca67-fe95-4c15-a39f-d6abc83292e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.835 183181 DEBUG nova.network.os_vif_util [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converting VIF {"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.836 183181 DEBUG nova.network.os_vif_util [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:53:01 compute-0 nova_compute[183177]: 2026-01-26 19:53:01.837 183181 DEBUG nova.objects.instance [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c94ca67-fe95-4c15-a39f-d6abc83292e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.313 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.314 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.348 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <uuid>8c94ca67-fe95-4c15-a39f-d6abc83292e6</uuid>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <name>instance-00000010</name>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324</nova:name>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:53:01</nova:creationTime>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:53:02 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:53:02 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         <nova:port uuid="eca2f979-3d52-4d9a-b618-e06f25abcbce">
Jan 26 19:53:02 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <system>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="serial">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="uuid">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </system>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <os>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </os>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <features>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </features>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:ab:4a:5c"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <target dev="tapeca2f979-3d"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <video>
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </video>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:53:02 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:53:02 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:53:02 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:53:02 compute-0 nova_compute[183177]: </domain>
Jan 26 19:53:02 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.350 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Preparing to wait for external event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.350 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.350 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.350 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.351 183181 DEBUG nova.virt.libvirt.vif [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-122',id=16,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-ka0nniqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:52:57Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=8c94ca67-fe95-4c15-a39f-d6abc83292e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.351 183181 DEBUG nova.network.os_vif_util [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converting VIF {"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.351 183181 DEBUG nova.network.os_vif_util [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.352 183181 DEBUG os_vif [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.352 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.352 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.353 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.353 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.353 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '67808dee-679a-5b46-9125-f0c027232538', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.354 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.356 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.359 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.360 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeca2f979-3d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.360 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapeca2f979-3d, col_values=(('qos', UUID('abf5c089-b0b6-4151-888e-92e9992bb799')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.360 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapeca2f979-3d, col_values=(('external_ids', {'iface-id': 'eca2f979-3d52-4d9a-b618-e06f25abcbce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:4a:5c', 'vm-uuid': '8c94ca67-fe95-4c15-a39f-d6abc83292e6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.362 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 NetworkManager[55489]: <info>  [1769457182.3649] manager: (tapeca2f979-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.365 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.370 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:02 compute-0 nova_compute[183177]: 2026-01-26 19:53:02.371 183181 INFO os_vif [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d')
Jan 26 19:53:03 compute-0 nova_compute[183177]: 2026-01-26 19:53:03.921 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:53:03 compute-0 nova_compute[183177]: 2026-01-26 19:53:03.921 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:53:03 compute-0 nova_compute[183177]: 2026-01-26 19:53:03.921 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No VIF found with MAC fa:16:3e:ab:4a:5c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:53:03 compute-0 nova_compute[183177]: 2026-01-26 19:53:03.922 183181 INFO nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Using config drive
Jan 26 19:53:04 compute-0 nova_compute[183177]: 2026-01-26 19:53:04.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:04 compute-0 nova_compute[183177]: 2026-01-26 19:53:04.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:04 compute-0 nova_compute[183177]: 2026-01-26 19:53:04.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:53:04 compute-0 nova_compute[183177]: 2026-01-26 19:53:04.435 183181 WARNING neutronclient.v2_0.client [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:53:05 compute-0 nova_compute[183177]: 2026-01-26 19:53:05.327 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:05 compute-0 nova_compute[183177]: 2026-01-26 19:53:05.667 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.666 183181 INFO nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Creating config drive at /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.675 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2lntbumn execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.817 183181 DEBUG oslo_concurrency.processutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2lntbumn" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:53:06 compute-0 kernel: tapeca2f979-3d: entered promiscuous mode
Jan 26 19:53:06 compute-0 ovn_controller[95396]: 2026-01-26T19:53:06Z|00127|binding|INFO|Claiming lport eca2f979-3d52-4d9a-b618-e06f25abcbce for this chassis.
Jan 26 19:53:06 compute-0 ovn_controller[95396]: 2026-01-26T19:53:06Z|00128|binding|INFO|eca2f979-3d52-4d9a-b618-e06f25abcbce: Claiming fa:16:3e:ab:4a:5c 10.100.0.5
Jan 26 19:53:06 compute-0 NetworkManager[55489]: <info>  [1769457186.9024] manager: (tapeca2f979-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.902 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.909 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.912 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.926 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:4a:5c 10.100.0.5'], port_security=['fa:16:3e:ab:4a:5c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8c94ca67-fe95-4c15-a39f-d6abc83292e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3f24c591-c667-4b44-9fc4-3f4f62949186', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87196033-f000-4959-a8ea-24ef132800a5, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=eca2f979-3d52-4d9a-b618-e06f25abcbce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.927 104672 INFO neutron.agent.ovn.metadata.agent [-] Port eca2f979-3d52-4d9a-b618-e06f25abcbce in datapath dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 bound to our chassis
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.929 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.943 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d3704c-1919-4308-9263-c2cda0a68563]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.944 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdcbad604-d1 in ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.949 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdcbad604-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.949 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[075842db-0e7e-438d-900a-e8ba348df89c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:06 compute-0 systemd-machined[154465]: New machine qemu-11-instance-00000010.
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.951 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[52879d93-a7c8-449b-b597-ddeae3fa1b2d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:06 compute-0 systemd-udevd[210111]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.964 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe1b359-3f85-4675-bd5b-63f0d8768d70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:06 compute-0 NetworkManager[55489]: <info>  [1769457186.9765] device (tapeca2f979-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:53:06 compute-0 NetworkManager[55489]: <info>  [1769457186.9773] device (tapeca2f979-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.985 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:06.985 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[29daf527-10ea-432c-a905-5fbae8850db7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:06 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000010.
Jan 26 19:53:06 compute-0 ovn_controller[95396]: 2026-01-26T19:53:06Z|00129|binding|INFO|Setting lport eca2f979-3d52-4d9a-b618-e06f25abcbce ovn-installed in OVS
Jan 26 19:53:06 compute-0 ovn_controller[95396]: 2026-01-26T19:53:06Z|00130|binding|INFO|Setting lport eca2f979-3d52-4d9a-b618-e06f25abcbce up in Southbound
Jan 26 19:53:06 compute-0 nova_compute[183177]: 2026-01-26 19:53:06.990 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.027 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[964ae675-f166-4357-9002-58a1307c7c8e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.033 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6cdfa0-db4f-415f-a633-ed2558419dc2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 systemd-udevd[210115]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:53:07 compute-0 NetworkManager[55489]: <info>  [1769457187.0344] manager: (tapdcbad604-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.064 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[69b0a109-b7a7-49a2-83f2-314887f12562]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.067 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[62043973-966d-4fd8-a259-d1c9744cd5cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 NetworkManager[55489]: <info>  [1769457187.0900] device (tapdcbad604-d0): carrier: link connected
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.095 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[19f9a06e-3791-46bb-acaf-b71a38c61f2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.116 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dd799706-aab1-4442-98c7-fb46b2a33fa0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcbad604-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:f5:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464597, 'reachable_time': 32986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210142, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.132 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9a85ea15-ea2c-4af4-9463-86179acebbba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:f567'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464597, 'tstamp': 464597}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210144, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.155 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5b85d29e-dda7-4fff-a3ef-c589b080dc99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcbad604-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:f5:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464597, 'reachable_time': 32986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210145, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.193 183181 DEBUG nova.compute.manager [req-238ed2a8-f74d-4f17-88f0-5e6da6f315c2 req-73da2631-276e-44ff-8b91-b04996e1b84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.194 183181 DEBUG oslo_concurrency.lockutils [req-238ed2a8-f74d-4f17-88f0-5e6da6f315c2 req-73da2631-276e-44ff-8b91-b04996e1b84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.194 183181 DEBUG oslo_concurrency.lockutils [req-238ed2a8-f74d-4f17-88f0-5e6da6f315c2 req-73da2631-276e-44ff-8b91-b04996e1b84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.194 183181 DEBUG oslo_concurrency.lockutils [req-238ed2a8-f74d-4f17-88f0-5e6da6f315c2 req-73da2631-276e-44ff-8b91-b04996e1b84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.195 183181 DEBUG nova.compute.manager [req-238ed2a8-f74d-4f17-88f0-5e6da6f315c2 req-73da2631-276e-44ff-8b91-b04996e1b84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Processing event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.201 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1388a7-3139-40de-8bc7-2c9845005970]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.286 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[638924fd-b723-496f-9b90-0417680d291d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.287 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcbad604-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.288 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.288 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcbad604-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.290 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 NetworkManager[55489]: <info>  [1769457187.2912] manager: (tapdcbad604-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 26 19:53:07 compute-0 kernel: tapdcbad604-d0: entered promiscuous mode
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.294 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.295 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcbad604-d0, col_values=(('external_ids', {'iface-id': '693f04a5-b9eb-445c-95f9-2771857fca3b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.296 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 ovn_controller[95396]: 2026-01-26T19:53:07Z|00131|binding|INFO|Releasing lport 693f04a5-b9eb-445c-95f9-2771857fca3b from this chassis (sb_readonly=0)
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.322 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.324 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[90d79db3-ea04-423a-b525-b9b17c7aec4b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.325 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.325 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.325 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.325 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.326 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[531ea898-7fd6-4c6c-a4d6-a50786490b59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.327 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.327 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2945dc47-2d7a-4ce9-a94d-8cfda4d6b976]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.328 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:53:07 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:07.329 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'env', 'PROCESS_TAG=haproxy-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.362 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.443 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.455 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.458 183181 INFO nova.virt.libvirt.driver [-] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance spawned successfully.
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.459 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:53:07 compute-0 podman[210184]: 2026-01-26 19:53:07.82187885 +0000 UTC m=+0.089062469 container create 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:53:07 compute-0 podman[210184]: 2026-01-26 19:53:07.778660416 +0000 UTC m=+0.045844085 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:53:07 compute-0 systemd[1]: Started libpod-conmon-48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3.scope.
Jan 26 19:53:07 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:53:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbf910b4bf0a48a441a8388d52885e9258643824e4c92dff2b13887edcf0911/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:53:07 compute-0 podman[210184]: 2026-01-26 19:53:07.948125829 +0000 UTC m=+0.215309518 container init 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:53:07 compute-0 podman[210184]: 2026-01-26 19:53:07.960320517 +0000 UTC m=+0.227504136 container start 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.975 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.975 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.976 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.977 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.978 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 nova_compute[183177]: 2026-01-26 19:53:07.978 183181 DEBUG nova.virt.libvirt.driver [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:53:07 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [NOTICE]   (210203) : New worker (210205) forked
Jan 26 19:53:07 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [NOTICE]   (210203) : Loading success.
Jan 26 19:53:08 compute-0 nova_compute[183177]: 2026-01-26 19:53:08.492 183181 INFO nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Took 9.66 seconds to spawn the instance on the hypervisor.
Jan 26 19:53:08 compute-0 nova_compute[183177]: 2026-01-26 19:53:08.493 183181 DEBUG nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.026 183181 INFO nova.compute.manager [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Took 14.95 seconds to build instance.
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.234 183181 DEBUG nova.compute.manager [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.234 183181 DEBUG oslo_concurrency.lockutils [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.234 183181 DEBUG oslo_concurrency.lockutils [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.234 183181 DEBUG oslo_concurrency.lockutils [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.235 183181 DEBUG nova.compute.manager [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No waiting events found dispatching network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.235 183181 WARNING nova.compute.manager [req-d3b186c8-fa56-4955-9391-f1a51034babc req-8c09ee79-09d6-4bed-9126-e0280a54cd07 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received unexpected event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with vm_state active and task_state None.
Jan 26 19:53:09 compute-0 nova_compute[183177]: 2026-01-26 19:53:09.530 183181 DEBUG oslo_concurrency.lockutils [None req-2230e2c6-b743-4023-a62c-c08d532943fd 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.471s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:10 compute-0 nova_compute[183177]: 2026-01-26 19:53:10.353 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:12 compute-0 nova_compute[183177]: 2026-01-26 19:53:12.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:12 compute-0 nova_compute[183177]: 2026-01-26 19:53:12.363 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:13 compute-0 podman[210214]: 2026-01-26 19:53:13.379911833 +0000 UTC m=+0.129344183 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 26 19:53:15 compute-0 nova_compute[183177]: 2026-01-26 19:53:15.388 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:16 compute-0 podman[210241]: 2026-01-26 19:53:16.328266164 +0000 UTC m=+0.068672900 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 19:53:16 compute-0 podman[210240]: 2026-01-26 19:53:16.356682759 +0000 UTC m=+0.094443543 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal)
Jan 26 19:53:17 compute-0 nova_compute[183177]: 2026-01-26 19:53:17.365 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:19 compute-0 ovn_controller[95396]: 2026-01-26T19:53:19Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:4a:5c 10.100.0.5
Jan 26 19:53:19 compute-0 ovn_controller[95396]: 2026-01-26T19:53:19Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:4a:5c 10.100.0.5
Jan 26 19:53:20 compute-0 nova_compute[183177]: 2026-01-26 19:53:20.416 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:21 compute-0 podman[210297]: 2026-01-26 19:53:21.355363043 +0000 UTC m=+0.095354019 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:53:22 compute-0 nova_compute[183177]: 2026-01-26 19:53:22.368 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:24.064 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:24.064 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:24.065 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:25 compute-0 nova_compute[183177]: 2026-01-26 19:53:25.446 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:27 compute-0 nova_compute[183177]: 2026-01-26 19:53:27.370 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:29 compute-0 podman[192499]: time="2026-01-26T19:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:53:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:53:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Jan 26 19:53:30 compute-0 nova_compute[183177]: 2026-01-26 19:53:30.447 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:31 compute-0 openstack_network_exporter[195363]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:53:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:53:31 compute-0 openstack_network_exporter[195363]: ERROR   19:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:53:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:53:32 compute-0 nova_compute[183177]: 2026-01-26 19:53:32.372 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:35 compute-0 nova_compute[183177]: 2026-01-26 19:53:35.450 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:37 compute-0 sshd-session[210324]: Connection closed by authenticating user root 142.93.140.142 port 57356 [preauth]
Jan 26 19:53:37 compute-0 nova_compute[183177]: 2026-01-26 19:53:37.404 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:40 compute-0 nova_compute[183177]: 2026-01-26 19:53:40.485 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:41 compute-0 nova_compute[183177]: 2026-01-26 19:53:41.711 183181 DEBUG nova.compute.manager [None req-6bafee4a-62b2-4638-ab5e-715a5deaad6d 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Jan 26 19:53:41 compute-0 nova_compute[183177]: 2026-01-26 19:53:41.771 183181 DEBUG nova.compute.provider_tree [None req-6bafee4a-62b2-4638-ab5e-715a5deaad6d 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 27 to 29 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:53:42 compute-0 nova_compute[183177]: 2026-01-26 19:53:42.406 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:44 compute-0 podman[210326]: 2026-01-26 19:53:44.390409033 +0000 UTC m=+0.134629306 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 19:53:45 compute-0 nova_compute[183177]: 2026-01-26 19:53:45.488 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:47 compute-0 podman[210354]: 2026-01-26 19:53:47.352833992 +0000 UTC m=+0.092399779 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 19:53:47 compute-0 podman[210353]: 2026-01-26 19:53:47.384744971 +0000 UTC m=+0.129260492 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter)
Jan 26 19:53:47 compute-0 nova_compute[183177]: 2026-01-26 19:53:47.408 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:48 compute-0 nova_compute[183177]: 2026-01-26 19:53:48.794 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Check if temp file /var/lib/nova/instances/tmpre8z7ihf exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 19:53:48 compute-0 nova_compute[183177]: 2026-01-26 19:53:48.800 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpre8z7ihf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8c94ca67-fe95-4c15-a39f-d6abc83292e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 19:53:50 compute-0 nova_compute[183177]: 2026-01-26 19:53:50.531 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:52 compute-0 podman[210393]: 2026-01-26 19:53:52.326696566 +0000 UTC m=+0.073496809 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:53:52 compute-0 nova_compute[183177]: 2026-01-26 19:53:52.410 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:52 compute-0 nova_compute[183177]: 2026-01-26 19:53:52.973 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.077 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.078 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.182 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.184 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Preparing to wait for external event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.185 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.185 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:53 compute-0 nova_compute[183177]: 2026-01-26 19:53:53.186 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:54 compute-0 nova_compute[183177]: 2026-01-26 19:53:54.673 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:55 compute-0 nova_compute[183177]: 2026-01-26 19:53:55.535 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:56 compute-0 nova_compute[183177]: 2026-01-26 19:53:56.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:57 compute-0 ovn_controller[95396]: 2026-01-26T19:53:57Z|00132|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 26 19:53:57 compute-0 nova_compute[183177]: 2026-01-26 19:53:57.412 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:58.879 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.880 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:53:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:53:58.880 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.902 183181 DEBUG nova.compute.manager [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.903 183181 DEBUG oslo_concurrency.lockutils [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.903 183181 DEBUG oslo_concurrency.lockutils [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.904 183181 DEBUG oslo_concurrency.lockutils [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.904 183181 DEBUG nova.compute.manager [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No event matching network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce in dict_keys([('network-vif-plugged', 'eca2f979-3d52-4d9a-b618-e06f25abcbce')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 19:53:58 compute-0 nova_compute[183177]: 2026-01-26 19:53:58.905 183181 DEBUG nova.compute.manager [req-267d16ef-3fc5-44da-9aae-698a16a1a608 req-bd9b7903-411a-4d7a-a787-37a47e57ff9b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.152 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:53:59 compute-0 nova_compute[183177]: 2026-01-26 19:53:59.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:53:59 compute-0 podman[192499]: time="2026-01-26T19:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:53:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:53:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 26 19:54:00 compute-0 nova_compute[183177]: 2026-01-26 19:54:00.566 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:00 compute-0 nova_compute[183177]: 2026-01-26 19:54:00.714 183181 INFO nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Took 7.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 19:54:00 compute-0 nova_compute[183177]: 2026-01-26 19:54:00.926 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.007 183181 DEBUG nova.compute.manager [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.007 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.008 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.008 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.008 183181 DEBUG nova.compute.manager [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Processing event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.009 183181 DEBUG nova.compute.manager [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-changed-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.009 183181 DEBUG nova.compute.manager [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Refreshing instance network info cache due to event network-changed-eca2f979-3d52-4d9a-b618-e06f25abcbce. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.009 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.010 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.010 183181 DEBUG nova.network.neutron [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Refreshing network info cache for port eca2f979-3d52-4d9a-b618-e06f25abcbce _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.012 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.021 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.022 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.085 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.271 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.273 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.311 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.312 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5572MB free_disk=73.06946182250977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:01 compute-0 openstack_network_exporter[195363]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:54:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:54:01 compute-0 openstack_network_exporter[195363]: ERROR   19:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:54:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.519 183181 WARNING neutronclient.v2_0.client [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:01 compute-0 nova_compute[183177]: 2026-01-26 19:54:01.532 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpre8z7ihf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8c94ca67-fe95-4c15-a39f-d6abc83292e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ef5a2da2-62ac-4f57-bbb5-cde71e4ba15f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.050 183181 DEBUG nova.objects.instance [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c94ca67-fe95-4c15-a39f-d6abc83292e6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.052 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.056 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.056 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.174 183181 WARNING neutronclient.v2_0.client [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.336 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updating resource usage from migration ef5a2da2-62ac-4f57-bbb5-cde71e4ba15f
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.414 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration ef5a2da2-62ac-4f57-bbb5-cde71e4ba15f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.415 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.415 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:54:01 up  1:18,  0 user,  load average: 0.26, 0.22, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_00d6147467834874bb42a420f895fa88': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.419 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.493 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.559 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.560 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.565 183181 DEBUG nova.virt.libvirt.vif [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-122',id=16,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:53:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-ka0nniqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:53:08Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=8c94ca67-fe95-4c15-a39f-d6abc83292e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.566 183181 DEBUG nova.network.os_vif_util [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.566 183181 DEBUG nova.network.os_vif_util [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.567 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:ab:4a:5c"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <target dev="tapeca2f979-3d"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]: </interface>
Jan 26 19:54:02 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.568 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <name>instance-00000010</name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <uuid>8c94ca67-fe95-4c15-a39f-d6abc83292e6</uuid>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324</nova:name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:53:01</nova:creationTime>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:port uuid="eca2f979-3d52-4d9a-b618-e06f25abcbce">
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="serial">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="uuid">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:ab:4a:5c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeca2f979-3d"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </target>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </console>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </input>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]: </domain>
Jan 26 19:54:02 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.569 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <name>instance-00000010</name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <uuid>8c94ca67-fe95-4c15-a39f-d6abc83292e6</uuid>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324</nova:name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:53:01</nova:creationTime>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:port uuid="eca2f979-3d52-4d9a-b618-e06f25abcbce">
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="serial">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="uuid">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:ab:4a:5c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapeca2f979-3d"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </target>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </console>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </input>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]: </domain>
Jan 26 19:54:02 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.569 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <name>instance-00000010</name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <uuid>8c94ca67-fe95-4c15-a39f-d6abc83292e6</uuid>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324</nova:name>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:53:01</nova:creationTime>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <nova:port uuid="eca2f979-3d52-4d9a-b618-e06f25abcbce">
Jan 26 19:54:02 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="serial">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="uuid">8c94ca67-fe95-4c15-a39f-d6abc83292e6</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </system>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </os>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </features>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/disk.config"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:ab:4a:5c"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target dev="tapeca2f979-3d"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:54:02 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       </target>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6/console.log" append="off"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </console>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </input>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </video>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:54:02 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:54:02 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:54:02 compute-0 nova_compute[183177]: </domain>
Jan 26 19:54:02 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.569 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.709 183181 DEBUG nova.network.neutron [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updated VIF entry in instance network info cache for port eca2f979-3d52-4d9a-b618-e06f25abcbce. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:54:02 compute-0 nova_compute[183177]: 2026-01-26 19:54:02.710 183181 DEBUG nova.network.neutron [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Updating instance_info_cache with network_info: [{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.002 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.063 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.063 183181 INFO nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.218 183181 DEBUG oslo_concurrency.lockutils [req-a45ef54b-f3de-496d-8b09-e39a8da306ee req-d5a44519-c3a7-46fb-b1b6-a32b047fd5d8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-8c94ca67-fe95-4c15-a39f-d6abc83292e6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.515 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:54:03 compute-0 nova_compute[183177]: 2026-01-26 19:54:03.516 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.203s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.093 183181 INFO nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.598 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.598 183181 DEBUG nova.virt.libvirt.migration [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 19:54:04 compute-0 kernel: tapeca2f979-3d (unregistering): left promiscuous mode
Jan 26 19:54:04 compute-0 NetworkManager[55489]: <info>  [1769457244.6720] device (tapeca2f979-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:54:04 compute-0 ovn_controller[95396]: 2026-01-26T19:54:04Z|00133|binding|INFO|Releasing lport eca2f979-3d52-4d9a-b618-e06f25abcbce from this chassis (sb_readonly=0)
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.680 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:04 compute-0 ovn_controller[95396]: 2026-01-26T19:54:04Z|00134|binding|INFO|Setting lport eca2f979-3d52-4d9a-b618-e06f25abcbce down in Southbound
Jan 26 19:54:04 compute-0 ovn_controller[95396]: 2026-01-26T19:54:04Z|00135|binding|INFO|Removing iface tapeca2f979-3d ovn-installed in OVS
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.684 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.690 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:4a:5c 10.100.0.5'], port_security=['fa:16:3e:ab:4a:5c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8c94ca67-fe95-4c15-a39f-d6abc83292e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3f24c591-c667-4b44-9fc4-3f4f62949186', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87196033-f000-4959-a8ea-24ef132800a5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=eca2f979-3d52-4d9a-b618-e06f25abcbce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.692 104672 INFO neutron.agent.ovn.metadata.agent [-] Port eca2f979-3d52-4d9a-b618-e06f25abcbce in datapath dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 unbound from our chassis
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.694 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.697 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d35fedb6-9116-4600-bc23-cf9c1c3b817f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.699 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 namespace which is not needed anymore
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.706 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:04 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 26 19:54:04 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Consumed 15.560s CPU time.
Jan 26 19:54:04 compute-0 systemd-machined[154465]: Machine qemu-11-instance-00000010 terminated.
Jan 26 19:54:04 compute-0 podman[210467]: 2026-01-26 19:54:04.849024154 +0000 UTC m=+0.039578666 container kill 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 19:54:04 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [NOTICE]   (210203) : haproxy version is 3.0.5-8e879a5
Jan 26 19:54:04 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [NOTICE]   (210203) : path to executable is /usr/sbin/haproxy
Jan 26 19:54:04 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [WARNING]  (210203) : Exiting Master process...
Jan 26 19:54:04 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [ALERT]    (210203) : Current worker (210205) exited with code 143 (Terminated)
Jan 26 19:54:04 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210199]: [WARNING]  (210203) : All workers exited. Exiting... (0)
Jan 26 19:54:04 compute-0 systemd[1]: libpod-48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3.scope: Deactivated successfully.
Jan 26 19:54:04 compute-0 podman[210483]: 2026-01-26 19:54:04.918104474 +0000 UTC m=+0.047004246 container died 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.938 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.939 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 19:54:04 compute-0 nova_compute[183177]: 2026-01-26 19:54:04.939 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 19:54:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3-userdata-shm.mount: Deactivated successfully.
Jan 26 19:54:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-efbf910b4bf0a48a441a8388d52885e9258643824e4c92dff2b13887edcf0911-merged.mount: Deactivated successfully.
Jan 26 19:54:04 compute-0 podman[210483]: 2026-01-26 19:54:04.964659958 +0000 UTC m=+0.093559700 container cleanup 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:54:04 compute-0 systemd[1]: libpod-conmon-48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3.scope: Deactivated successfully.
Jan 26 19:54:04 compute-0 podman[210488]: 2026-01-26 19:54:04.988770387 +0000 UTC m=+0.090279452 container remove 48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Jan 26 19:54:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.997 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6d487e9a-7b87-4bd8-bc33-ae46debe8142]: (4, ("Mon Jan 26 07:54:04 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 (48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3)\n48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3\nMon Jan 26 07:54:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 (48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3)\n48d7ce5f2d6506f1b133a9ae3e5bef51d7698f4d9eff2563dac52c2eceed7be3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:04.999 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3345cef7-a039-440d-b042-a99997da1baf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.000 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.001 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d228ce8a-cbd5-4108-bbb6-7e271b5b2083]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.003 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcbad604-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.006 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:05 compute-0 kernel: tapdcbad604-d0: left promiscuous mode
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.035 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.040 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[65750a47-770a-46b6-858b-5b93f38f9b55]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.055 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2db481d4-8956-4452-8c74-61a07ff53777]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.056 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2480f945-f7b7-4f47-959b-faa142ba4fd9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.083 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5a0248-bfeb-4b5c-bd35-4425d9a9c029]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464591, 'reachable_time': 16392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210532, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 systemd[1]: run-netns-ovnmeta\x2ddcbad604\x2ddc9d\x2d41d5\x2da1f5\x2d0fd2f7c5a4b4.mount: Deactivated successfully.
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.089 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.091 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[de23fe3d-4cbc-4334-8489-0e480935f523]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.101 183181 DEBUG nova.virt.libvirt.guest [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '8c94ca67-fe95-4c15-a39f-d6abc83292e6' (instance-00000010) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.101 183181 INFO nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migration operation has completed
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.102 183181 INFO nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] _post_live_migration() is started..
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.118 183181 WARNING neutronclient.v2_0.client [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.119 183181 WARNING neutronclient.v2_0.client [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:05 compute-0 nova_compute[183177]: 2026-01-26 19:54:05.568 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:05.883 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.968 183181 DEBUG nova.compute.manager [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.968 183181 DEBUG oslo_concurrency.lockutils [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.969 183181 DEBUG oslo_concurrency.lockutils [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.969 183181 DEBUG oslo_concurrency.lockutils [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.970 183181 DEBUG nova.compute.manager [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No waiting events found dispatching network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:54:06 compute-0 nova_compute[183177]: 2026-01-26 19:54:06.970 183181 DEBUG nova.compute.manager [req-c24eeb94-6081-4365-8549-814760368bc5 req-f6cfdb9f-e60c-4181-a1aa-01ae06849cff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.220 183181 DEBUG nova.network.neutron [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port eca2f979-3d52-4d9a-b618-e06f25abcbce and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.221 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.222 183181 DEBUG nova.virt.libvirt.vif [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1227319324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-122',id=16,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:53:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-ka0nniqs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:53:44Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=8c94ca67-fe95-4c15-a39f-d6abc83292e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.222 183181 DEBUG nova.network.os_vif_util [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "address": "fa:16:3e:ab:4a:5c", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeca2f979-3d", "ovs_interfaceid": "eca2f979-3d52-4d9a-b618-e06f25abcbce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.223 183181 DEBUG nova.network.os_vif_util [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.223 183181 DEBUG os_vif [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.225 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.226 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeca2f979-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.227 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.229 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.231 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.231 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=abf5c089-b0b6-4151-888e-92e9992bb799) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.232 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.233 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.237 183181 INFO os_vif [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:4a:5c,bridge_name='br-int',has_traffic_filtering=True,id=eca2f979-3d52-4d9a-b618-e06f25abcbce,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeca2f979-3d')
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.238 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.238 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.238 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.239 183181 DEBUG nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.239 183181 INFO nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Deleting instance files /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6_del
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.240 183181 INFO nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Deletion of /var/lib/nova/instances/8c94ca67-fe95-4c15-a39f-d6abc83292e6_del complete
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.517 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.518 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:07 compute-0 nova_compute[183177]: 2026-01-26 19:54:07.518 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.037 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.038 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.038 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.039 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.039 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No waiting events found dispatching network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.039 183181 WARNING nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received unexpected event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with vm_state active and task_state migrating.
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.040 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.040 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.041 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.042 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.042 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No waiting events found dispatching network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.042 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-unplugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.043 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.043 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.044 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.044 183181 DEBUG oslo_concurrency.lockutils [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.044 183181 DEBUG nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] No waiting events found dispatching network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.045 183181 WARNING nova.compute.manager [req-4c590bd7-f4ab-44b4-90fc-0c201fd50230 req-9a7eed8b-9e3d-49fa-b663-b36947e87259 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Received unexpected event network-vif-plugged-eca2f979-3d52-4d9a-b618-e06f25abcbce for instance with vm_state active and task_state migrating.
Jan 26 19:54:09 compute-0 nova_compute[183177]: 2026-01-26 19:54:09.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:10 compute-0 nova_compute[183177]: 2026-01-26 19:54:10.621 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:12 compute-0 nova_compute[183177]: 2026-01-26 19:54:12.234 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:15 compute-0 podman[210533]: 2026-01-26 19:54:15.480017492 +0000 UTC m=+0.217963359 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Jan 26 19:54:15 compute-0 nova_compute[183177]: 2026-01-26 19:54:15.624 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.237 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.286 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.287 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.287 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "8c94ca67-fe95-4c15-a39f-d6abc83292e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.806 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.807 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.808 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:17 compute-0 nova_compute[183177]: 2026-01-26 19:54:17.808 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:54:17 compute-0 podman[210563]: 2026-01-26 19:54:17.966410235 +0000 UTC m=+0.090313383 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 26 19:54:17 compute-0 podman[210562]: 2026-01-26 19:54:17.987917814 +0000 UTC m=+0.117538156 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.056 183181 WARNING nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.058 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.105 183181 DEBUG oslo_concurrency.processutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.106 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.0984992980957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.106 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:18 compute-0 nova_compute[183177]: 2026-01-26 19:54:18.106 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.133 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 8c94ca67-fe95-4c15-a39f-d6abc83292e6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.643 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.684 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration ef5a2da2-62ac-4f57-bbb5-cde71e4ba15f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.685 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.685 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:54:18 up  1:18,  0 user,  load average: 0.31, 0.23, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:54:19 compute-0 nova_compute[183177]: 2026-01-26 19:54:19.730 183181 DEBUG nova.compute.provider_tree [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:54:20 compute-0 nova_compute[183177]: 2026-01-26 19:54:20.237 183181 DEBUG nova.scheduler.client.report [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:54:20 compute-0 sshd-session[210603]: Connection closed by authenticating user root 142.93.140.142 port 37060 [preauth]
Jan 26 19:54:20 compute-0 nova_compute[183177]: 2026-01-26 19:54:20.628 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:20 compute-0 nova_compute[183177]: 2026-01-26 19:54:20.755 183181 DEBUG nova.compute.resource_tracker [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:54:20 compute-0 nova_compute[183177]: 2026-01-26 19:54:20.756 183181 DEBUG oslo_concurrency.lockutils [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.650s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:20 compute-0 nova_compute[183177]: 2026-01-26 19:54:20.790 183181 INFO nova.compute.manager [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 19:54:21 compute-0 nova_compute[183177]: 2026-01-26 19:54:21.983 183181 INFO nova.scheduler.client.report [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration ef5a2da2-62ac-4f57-bbb5-cde71e4ba15f
Jan 26 19:54:21 compute-0 nova_compute[183177]: 2026-01-26 19:54:21.984 183181 DEBUG nova.virt.libvirt.driver [None req-d0f4fcf0-fe05-4f5a-bf8a-87db0f4e289f 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 8c94ca67-fe95-4c15-a39f-d6abc83292e6] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 19:54:22 compute-0 nova_compute[183177]: 2026-01-26 19:54:22.241 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:23 compute-0 podman[210607]: 2026-01-26 19:54:23.366112857 +0000 UTC m=+0.089133181 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:54:23 compute-0 sshd-session[210605]: Invalid user hadoop from 193.32.162.151 port 54830
Jan 26 19:54:23 compute-0 sshd-session[210605]: Connection closed by invalid user hadoop 193.32.162.151 port 54830 [preauth]
Jan 26 19:54:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:24.066 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:24.067 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:54:24.067 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:25 compute-0 nova_compute[183177]: 2026-01-26 19:54:25.665 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:26 compute-0 nova_compute[183177]: 2026-01-26 19:54:26.219 183181 DEBUG nova.compute.manager [None req-15f3450e-08fc-4e21-a7c8-997b57ba9593 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Jan 26 19:54:26 compute-0 nova_compute[183177]: 2026-01-26 19:54:26.289 183181 DEBUG nova.compute.provider_tree [None req-15f3450e-08fc-4e21-a7c8-997b57ba9593 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Updating resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 generation from 29 to 32 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 26 19:54:27 compute-0 nova_compute[183177]: 2026-01-26 19:54:27.246 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:29 compute-0 podman[192499]: time="2026-01-26T19:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:54:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:54:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 26 19:54:30 compute-0 nova_compute[183177]: 2026-01-26 19:54:30.704 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:31 compute-0 openstack_network_exporter[195363]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:54:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:54:31 compute-0 openstack_network_exporter[195363]: ERROR   19:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:54:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:54:32 compute-0 nova_compute[183177]: 2026-01-26 19:54:32.249 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:35 compute-0 nova_compute[183177]: 2026-01-26 19:54:35.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:37 compute-0 nova_compute[183177]: 2026-01-26 19:54:37.252 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:40 compute-0 sshd-session[210632]: Connection closed by 188.166.116.149 port 51376
Jan 26 19:54:40 compute-0 nova_compute[183177]: 2026-01-26 19:54:40.746 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:42 compute-0 nova_compute[183177]: 2026-01-26 19:54:42.255 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:45 compute-0 nova_compute[183177]: 2026-01-26 19:54:45.781 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:46 compute-0 podman[210633]: 2026-01-26 19:54:46.415859883 +0000 UTC m=+0.149457305 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120)
Jan 26 19:54:47 compute-0 nova_compute[183177]: 2026-01-26 19:54:47.258 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:48 compute-0 podman[210660]: 2026-01-26 19:54:48.329162196 +0000 UTC m=+0.060023361 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:54:48 compute-0 podman[210659]: 2026-01-26 19:54:48.351789777 +0000 UTC m=+0.083996919 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 19:54:48 compute-0 nova_compute[183177]: 2026-01-26 19:54:48.486 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:48 compute-0 nova_compute[183177]: 2026-01-26 19:54:48.487 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:48 compute-0 nova_compute[183177]: 2026-01-26 19:54:48.993 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:54:49 compute-0 nova_compute[183177]: 2026-01-26 19:54:49.556 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:49 compute-0 nova_compute[183177]: 2026-01-26 19:54:49.557 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:49 compute-0 nova_compute[183177]: 2026-01-26 19:54:49.566 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:54:49 compute-0 nova_compute[183177]: 2026-01-26 19:54:49.567 183181 INFO nova.compute.claims [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:54:50 compute-0 nova_compute[183177]: 2026-01-26 19:54:50.642 183181 DEBUG nova.compute.provider_tree [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:54:50 compute-0 nova_compute[183177]: 2026-01-26 19:54:50.825 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:51 compute-0 nova_compute[183177]: 2026-01-26 19:54:51.155 183181 DEBUG nova.scheduler.client.report [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:54:51 compute-0 nova_compute[183177]: 2026-01-26 19:54:51.669 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:51 compute-0 nova_compute[183177]: 2026-01-26 19:54:51.670 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.188 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.188 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.189 183181 WARNING neutronclient.v2_0.client [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.190 183181 WARNING neutronclient.v2_0.client [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.262 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.700 183181 INFO nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:54:52 compute-0 nova_compute[183177]: 2026-01-26 19:54:52.815 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Successfully created port: e1e6b646-6f32-4142-8c54-471be13d3049 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:54:53 compute-0 nova_compute[183177]: 2026-01-26 19:54:53.211 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.101 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Successfully updated port: e1e6b646-6f32-4142-8c54-471be13d3049 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.175 183181 DEBUG nova.compute.manager [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-changed-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.176 183181 DEBUG nova.compute.manager [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Refreshing instance network info cache due to event network-changed-e1e6b646-6f32-4142-8c54-471be13d3049. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.176 183181 DEBUG oslo_concurrency.lockutils [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.176 183181 DEBUG oslo_concurrency.lockutils [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.177 183181 DEBUG nova.network.neutron [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Refreshing network info cache for port e1e6b646-6f32-4142-8c54-471be13d3049 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.234 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.236 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.237 183181 INFO nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Creating image(s)
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.238 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.238 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.239 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.240 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.248 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.255 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:54 compute-0 podman[210700]: 2026-01-26 19:54:54.365656023 +0000 UTC m=+0.097396009 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.365 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.110s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.368 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.369 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.370 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.374 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.375 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.440 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.442 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.491 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.492 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.493 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.552 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.554 183181 DEBUG nova.virt.disk.api [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Checking if we can resize image /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.554 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.609 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.615 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.615 183181 DEBUG nova.virt.disk.api [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Cannot resize image /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.616 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.617 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Ensure instance console log exists: /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.617 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.618 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.619 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.695 183181 WARNING neutronclient.v2_0.client [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.795 183181 DEBUG nova.network.neutron [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:54:54 compute-0 nova_compute[183177]: 2026-01-26 19:54:54.977 183181 DEBUG nova.network.neutron [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:54:55 compute-0 nova_compute[183177]: 2026-01-26 19:54:55.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:55 compute-0 nova_compute[183177]: 2026-01-26 19:54:55.485 183181 DEBUG oslo_concurrency.lockutils [req-49e54dbd-7cde-4e63-a39a-f6f5f4e723f7 req-2af8d778-945f-49dc-a6d9-350d452281e9 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:54:55 compute-0 nova_compute[183177]: 2026-01-26 19:54:55.487 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquired lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:54:55 compute-0 nova_compute[183177]: 2026-01-26 19:54:55.487 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:54:55 compute-0 nova_compute[183177]: 2026-01-26 19:54:55.859 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:56 compute-0 nova_compute[183177]: 2026-01-26 19:54:56.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:54:56 compute-0 nova_compute[183177]: 2026-01-26 19:54:56.874 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:54:57 compute-0 nova_compute[183177]: 2026-01-26 19:54:57.103 183181 WARNING neutronclient.v2_0.client [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:54:57 compute-0 nova_compute[183177]: 2026-01-26 19:54:57.265 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:57 compute-0 nova_compute[183177]: 2026-01-26 19:54:57.879 183181 DEBUG nova.network.neutron [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updating instance_info_cache with network_info: [{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.392 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Releasing lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.393 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance network_info: |[{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.397 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Start _get_guest_xml network_info=[{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.401 183181 WARNING nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.403 183181 DEBUG nova.virt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482', uuid='e16c3193-bfb0-4095-98ff-c8bacc109e97'), owner=OwnerMeta(userid='0415606853f441d3b598e0af51d1b700', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin', projectid='00d6147467834874bb42a420f895fa88', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457298.4035914) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.409 183181 DEBUG nova.virt.libvirt.host [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.410 183181 DEBUG nova.virt.libvirt.host [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.412 183181 DEBUG nova.virt.libvirt.host [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.413 183181 DEBUG nova.virt.libvirt.host [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.414 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.415 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.415 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.416 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.416 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.416 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.416 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.417 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.417 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.417 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.417 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.418 183181 DEBUG nova.virt.hardware [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.423 183181 DEBUG nova.virt.libvirt.vif [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-239',id=18,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-aishmv0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:54:53Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=e16c3193-bfb0-4095-98ff-c8bacc109e97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.424 183181 DEBUG nova.network.os_vif_util [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converting VIF {"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.425 183181 DEBUG nova.network.os_vif_util [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.426 183181 DEBUG nova.objects.instance [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lazy-loading 'pci_devices' on Instance uuid e16c3193-bfb0-4095-98ff-c8bacc109e97 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.970 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <uuid>e16c3193-bfb0-4095-98ff-c8bacc109e97</uuid>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <name>instance-00000012</name>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482</nova:name>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:54:58</nova:creationTime>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:54:58 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:54:58 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         <nova:port uuid="e1e6b646-6f32-4142-8c54-471be13d3049">
Jan 26 19:54:58 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <system>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="serial">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="uuid">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </system>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <os>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </os>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <features>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </features>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:5e:ed:9e"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <target dev="tape1e6b646-6f"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <video>
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </video>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:54:58 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:54:58 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:54:58 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:54:58 compute-0 nova_compute[183177]: </domain>
Jan 26 19:54:58 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.973 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Preparing to wait for external event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.973 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.974 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.974 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.976 183181 DEBUG nova.virt.libvirt.vif [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-239',id=18,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-aishmv0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:54:53Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=e16c3193-bfb0-4095-98ff-c8bacc109e97,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.977 183181 DEBUG nova.network.os_vif_util [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converting VIF {"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.978 183181 DEBUG nova.network.os_vif_util [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.979 183181 DEBUG os_vif [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.980 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.980 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.981 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.982 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.982 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '142bf9d9-3038-51ee-8eed-891adfd02d29', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.985 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.987 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.993 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.994 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1e6b646-6f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.995 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape1e6b646-6f, col_values=(('qos', UUID('9fe12c16-2428-41ef-9486-1a8c31a8229d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.995 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape1e6b646-6f, col_values=(('external_ids', {'iface-id': 'e1e6b646-6f32-4142-8c54-471be13d3049', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:ed:9e', 'vm-uuid': 'e16c3193-bfb0-4095-98ff-c8bacc109e97'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:54:58 compute-0 nova_compute[183177]: 2026-01-26 19:54:58.998 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:59 compute-0 NetworkManager[55489]: <info>  [1769457299.0010] manager: (tape1e6b646-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 26 19:54:59 compute-0 nova_compute[183177]: 2026-01-26 19:54:59.001 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:54:59 compute-0 nova_compute[183177]: 2026-01-26 19:54:59.008 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:54:59 compute-0 nova_compute[183177]: 2026-01-26 19:54:59.010 183181 INFO os_vif [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f')
Jan 26 19:54:59 compute-0 podman[192499]: time="2026-01-26T19:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:54:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:54:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.554 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.554 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.555 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] No VIF found with MAC fa:16:3e:5e:ed:9e, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.555 183181 INFO nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Using config drive
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.664 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.666 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:55:00 compute-0 nova_compute[183177]: 2026-01-26 19:55:00.905 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.069 183181 WARNING neutronclient.v2_0.client [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:55:01 compute-0 openstack_network_exporter[195363]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:55:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:55:01 compute-0 openstack_network_exporter[195363]: ERROR   19:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:55:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:55:01 compute-0 sshd-session[210742]: Connection closed by authenticating user root 142.93.140.142 port 33620 [preauth]
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.713 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.805 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.807 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.877 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.879 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000012, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config'
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.912 183181 INFO nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Creating config drive at /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config
Jan 26 19:55:01 compute-0 nova_compute[183177]: 2026-01-26 19:55:01.921 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfxiqc3ua execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.060 183181 DEBUG oslo_concurrency.processutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpfxiqc3ua" returned: 0 in 0.139s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.073 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.074 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.100 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.101 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5723MB free_disk=73.0982780456543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.101 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.102 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:02 compute-0 kernel: tape1e6b646-6f: entered promiscuous mode
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.1429] manager: (tape1e6b646-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 26 19:55:02 compute-0 ovn_controller[95396]: 2026-01-26T19:55:02Z|00136|binding|INFO|Claiming lport e1e6b646-6f32-4142-8c54-471be13d3049 for this chassis.
Jan 26 19:55:02 compute-0 ovn_controller[95396]: 2026-01-26T19:55:02Z|00137|binding|INFO|e1e6b646-6f32-4142-8c54-471be13d3049: Claiming fa:16:3e:5e:ed:9e 10.100.0.13
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 systemd-udevd[210766]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.174 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:ed:9e 10.100.0.13'], port_security=['fa:16:3e:5e:ed:9e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e16c3193-bfb0-4095-98ff-c8bacc109e97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3f24c591-c667-4b44-9fc4-3f4f62949186', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87196033-f000-4959-a8ea-24ef132800a5, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=e1e6b646-6f32-4142-8c54-471be13d3049) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.175 104672 INFO neutron.agent.ovn.metadata.agent [-] Port e1e6b646-6f32-4142-8c54-471be13d3049 in datapath dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 bound to our chassis
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.176 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.176 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 ovn_controller[95396]: 2026-01-26T19:55:02Z|00138|binding|INFO|Setting lport e1e6b646-6f32-4142-8c54-471be13d3049 ovn-installed in OVS
Jan 26 19:55:02 compute-0 ovn_controller[95396]: 2026-01-26T19:55:02Z|00139|binding|INFO|Setting lport e1e6b646-6f32-4142-8c54-471be13d3049 up in Southbound
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.180 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.181 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.1965] device (tape1e6b646-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.1977] device (tape1e6b646-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.195 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[81821b78-7c87-4143-abfc-13c0e57ae819]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.198 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdcbad604-d1 in ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.201 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdcbad604-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.201 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[33238792-abcb-4ace-ab58-0a4376f28ece]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.203 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[27b8f93d-0eba-4e69-b786-920c29a343f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 systemd-machined[154465]: New machine qemu-12-instance-00000012.
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.217 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[3edd96f9-98f3-41c6-8947-6c266d901be9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.227 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4346188b-263a-4393-b435-f01925d2afb5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.267 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[77d780e3-b3b6-4ff3-9690-036cc5c70c3f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.273 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[342962ba-48ff-4cae-8a09-64a8ef04aa6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.2744] manager: (tapdcbad604-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.322 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[599c15cb-1b4f-4312-bf9d-83bc8577a1f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.329 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e63cae-b8ad-4f3d-8ef2-8e0fafdacda0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.3599] device (tapdcbad604-d0): carrier: link connected
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.367 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[86da826b-cd21-420a-9502-60f5e38b2f43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.390 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[61984614-22a4-4565-9b41-c0ab608c3ad8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcbad604-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:f5:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476124, 'reachable_time': 44277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210802, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.409 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b4905c3b-ed73-4cc9-8c58-9fd8cef85d5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0a:f567'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 476124, 'tstamp': 476124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210803, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.428 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0f0a8a-8dc6-44c0-b559-9bdf33bf035e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcbad604-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0a:f5:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476124, 'reachable_time': 44277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210804, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.476 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2e817579-4984-44ef-a02b-aa8c7456504f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.504 183181 DEBUG nova.compute.manager [req-ee731794-671a-4a55-a746-26f459677440 req-b1bdcc37-91a5-4a33-bab0-34b743d0c4eb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.505 183181 DEBUG oslo_concurrency.lockutils [req-ee731794-671a-4a55-a746-26f459677440 req-b1bdcc37-91a5-4a33-bab0-34b743d0c4eb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.505 183181 DEBUG oslo_concurrency.lockutils [req-ee731794-671a-4a55-a746-26f459677440 req-b1bdcc37-91a5-4a33-bab0-34b743d0c4eb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.506 183181 DEBUG oslo_concurrency.lockutils [req-ee731794-671a-4a55-a746-26f459677440 req-b1bdcc37-91a5-4a33-bab0-34b743d0c4eb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.506 183181 DEBUG nova.compute.manager [req-ee731794-671a-4a55-a746-26f459677440 req-b1bdcc37-91a5-4a33-bab0-34b743d0c4eb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Processing event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.565 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.566 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.581 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[87f5a7ec-c52c-49d0-ae23-5600f633a17d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.582 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcbad604-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.583 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.583 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcbad604-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.585 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 NetworkManager[55489]: <info>  [1769457302.5866] manager: (tapdcbad604-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 26 19:55:02 compute-0 kernel: tapdcbad604-d0: entered promiscuous mode
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.588 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.589 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcbad604-d0, col_values=(('external_ids', {'iface-id': '693f04a5-b9eb-445c-95f9-2771857fca3b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.591 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 ovn_controller[95396]: 2026-01-26T19:55:02Z|00140|binding|INFO|Releasing lport 693f04a5-b9eb-445c-95f9-2771857fca3b from this chassis (sb_readonly=0)
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.596 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6a666429-e7af-4ddf-8cb1-060cc3eff81d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.597 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.598 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.598 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.598 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.599 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[24c903e2-c05d-420c-8879-e627628b35af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.600 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.600 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0d45356b-087c-4803-96ad-13d8f851baa0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.601 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:55:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:02.603 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'env', 'PROCESS_TAG=haproxy-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:55:02 compute-0 nova_compute[183177]: 2026-01-26 19:55:02.609 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.024 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.031 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.035 183181 INFO nova.virt.libvirt.driver [-] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance spawned successfully.
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.036 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:55:03 compute-0 podman[210843]: 2026-01-26 19:55:03.059080586 +0000 UTC m=+0.064680006 container create 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team)
Jan 26 19:55:03 compute-0 systemd[1]: Started libpod-conmon-934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47.scope.
Jan 26 19:55:03 compute-0 podman[210843]: 2026-01-26 19:55:03.026728073 +0000 UTC m=+0.032327483 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:55:03 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:55:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd037b78690181f1d0f5de2a5f6714364414a4217ce60f24fd8a5dec8f3d027/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.157 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance e16c3193-bfb0-4095-98ff-c8bacc109e97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.158 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.158 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:55:02 up  1:19,  0 user,  load average: 0.30, 0.23, 0.30\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_00d6147467834874bb42a420f895fa88': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:55:03 compute-0 podman[210843]: 2026-01-26 19:55:03.168997764 +0000 UTC m=+0.174597194 container init 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 19:55:03 compute-0 podman[210843]: 2026-01-26 19:55:03.176958749 +0000 UTC m=+0.182558149 container start 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 19:55:03 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [NOTICE]   (210862) : New worker (210864) forked
Jan 26 19:55:03 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [NOTICE]   (210862) : Loading success.
Jan 26 19:55:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:03.250 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.261 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.552 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.553 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.554 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.555 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.556 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.557 183181 DEBUG nova.virt.libvirt.driver [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.772 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:55:03 compute-0 nova_compute[183177]: 2026-01-26 19:55:03.998 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.073 183181 INFO nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Took 9.84 seconds to spawn the instance on the hypervisor.
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.074 183181 DEBUG nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.284 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.285 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.183s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.589 183181 DEBUG nova.compute.manager [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.590 183181 DEBUG oslo_concurrency.lockutils [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.590 183181 DEBUG oslo_concurrency.lockutils [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.590 183181 DEBUG oslo_concurrency.lockutils [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.591 183181 DEBUG nova.compute.manager [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.591 183181 WARNING nova.compute.manager [req-b7c9706f-e621-4136-b791-391fd3b18aed req-d11e3073-bd21-42ad-babb-56f50ee6ad6a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received unexpected event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with vm_state active and task_state None.
Jan 26 19:55:04 compute-0 nova_compute[183177]: 2026-01-26 19:55:04.618 183181 INFO nova.compute.manager [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Took 15.11 seconds to build instance.
Jan 26 19:55:05 compute-0 nova_compute[183177]: 2026-01-26 19:55:05.124 183181 DEBUG oslo_concurrency.lockutils [None req-2c325712-7241-446e-ac25-8389441f1bb9 0415606853f441d3b598e0af51d1b700 00d6147467834874bb42a420f895fa88 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.637s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:05 compute-0 nova_compute[183177]: 2026-01-26 19:55:05.287 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:05 compute-0 nova_compute[183177]: 2026-01-26 19:55:05.288 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:05 compute-0 nova_compute[183177]: 2026-01-26 19:55:05.288 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:55:05 compute-0 nova_compute[183177]: 2026-01-26 19:55:05.908 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:06.252 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:55:07 compute-0 nova_compute[183177]: 2026-01-26 19:55:07.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:08 compute-0 nova_compute[183177]: 2026-01-26 19:55:08.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:09 compute-0 nova_compute[183177]: 2026-01-26 19:55:09.000 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:10 compute-0 nova_compute[183177]: 2026-01-26 19:55:10.943 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:14 compute-0 nova_compute[183177]: 2026-01-26 19:55:14.003 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:15 compute-0 ovn_controller[95396]: 2026-01-26T19:55:15Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:ed:9e 10.100.0.13
Jan 26 19:55:15 compute-0 ovn_controller[95396]: 2026-01-26T19:55:15Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:ed:9e 10.100.0.13
Jan 26 19:55:15 compute-0 nova_compute[183177]: 2026-01-26 19:55:15.969 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:17 compute-0 podman[210898]: 2026-01-26 19:55:17.425474765 +0000 UTC m=+0.170390181 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 19:55:19 compute-0 nova_compute[183177]: 2026-01-26 19:55:19.007 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:19 compute-0 podman[210924]: 2026-01-26 19:55:19.333966139 +0000 UTC m=+0.087952576 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 19:55:19 compute-0 podman[210925]: 2026-01-26 19:55:19.341267526 +0000 UTC m=+0.083058603 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Jan 26 19:55:20 compute-0 nova_compute[183177]: 2026-01-26 19:55:20.973 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:24 compute-0 nova_compute[183177]: 2026-01-26 19:55:24.009 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:24.068 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:24.069 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:55:24.070 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:25 compute-0 podman[210964]: 2026-01-26 19:55:25.342954877 +0000 UTC m=+0.078312915 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:55:26 compute-0 nova_compute[183177]: 2026-01-26 19:55:26.027 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:29 compute-0 nova_compute[183177]: 2026-01-26 19:55:29.012 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:29 compute-0 podman[192499]: time="2026-01-26T19:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:55:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:55:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2647 "" "Go-http-client/1.1"
Jan 26 19:55:31 compute-0 nova_compute[183177]: 2026-01-26 19:55:31.031 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:31 compute-0 openstack_network_exporter[195363]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:55:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:55:31 compute-0 openstack_network_exporter[195363]: ERROR   19:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:55:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:55:34 compute-0 nova_compute[183177]: 2026-01-26 19:55:34.015 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:36 compute-0 nova_compute[183177]: 2026-01-26 19:55:36.051 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:39 compute-0 nova_compute[183177]: 2026-01-26 19:55:39.016 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:41 compute-0 nova_compute[183177]: 2026-01-26 19:55:41.094 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:42 compute-0 sshd-session[210988]: Connection closed by authenticating user root 142.93.140.142 port 34504 [preauth]
Jan 26 19:55:44 compute-0 nova_compute[183177]: 2026-01-26 19:55:44.019 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:45 compute-0 nova_compute[183177]: 2026-01-26 19:55:45.160 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Check if temp file /var/lib/nova/instances/tmp47pzsps6 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 19:55:45 compute-0 nova_compute[183177]: 2026-01-26 19:55:45.167 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp47pzsps6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e16c3193-bfb0-4095-98ff-c8bacc109e97',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 19:55:46 compute-0 nova_compute[183177]: 2026-01-26 19:55:46.097 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:48 compute-0 podman[210990]: 2026-01-26 19:55:48.398851921 +0000 UTC m=+0.140109314 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:55:49 compute-0 nova_compute[183177]: 2026-01-26 19:55:49.022 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.021 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.123 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.125 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.196 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.197 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Preparing to wait for external event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.197 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.198 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:50 compute-0 nova_compute[183177]: 2026-01-26 19:55:50.198 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:50 compute-0 podman[211023]: 2026-01-26 19:55:50.326991476 +0000 UTC m=+0.073911306 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 19:55:50 compute-0 podman[211024]: 2026-01-26 19:55:50.32754948 +0000 UTC m=+0.073613858 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:55:51 compute-0 nova_compute[183177]: 2026-01-26 19:55:51.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:54 compute-0 nova_compute[183177]: 2026-01-26 19:55:54.024 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:55 compute-0 nova_compute[183177]: 2026-01-26 19:55:55.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.103 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:55:56 compute-0 ovn_controller[95396]: 2026-01-26T19:55:56Z|00141|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.235 183181 DEBUG nova.compute.manager [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.236 183181 DEBUG oslo_concurrency.lockutils [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.236 183181 DEBUG oslo_concurrency.lockutils [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.237 183181 DEBUG oslo_concurrency.lockutils [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.237 183181 DEBUG nova.compute.manager [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No event matching network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 in dict_keys([('network-vif-plugged', 'e1e6b646-6f32-4142-8c54-471be13d3049')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 19:55:56 compute-0 nova_compute[183177]: 2026-01-26 19:55:56.237 183181 DEBUG nova.compute.manager [req-b349b0f6-c556-4a1e-8e5a-3ddc510480d4 req-80253098-1c72-4865-9296-c378a83f1a4b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:55:56 compute-0 podman[211063]: 2026-01-26 19:55:56.356061091 +0000 UTC m=+0.095299953 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.324 183181 DEBUG nova.compute.manager [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.324 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.325 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.325 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.325 183181 DEBUG nova.compute.manager [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Processing event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.326 183181 DEBUG nova.compute.manager [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-changed-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.327 183181 DEBUG nova.compute.manager [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Refreshing instance network info cache due to event network-changed-e1e6b646-6f32-4142-8c54-471be13d3049. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.327 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.327 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.328 183181 DEBUG nova.network.neutron [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Refreshing network info cache for port e1e6b646-6f32-4142-8c54-471be13d3049 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.731 183181 INFO nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Took 8.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.732 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:55:58 compute-0 nova_compute[183177]: 2026-01-26 19:55:58.835 183181 WARNING neutronclient.v2_0.client [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.026 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.240 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp47pzsps6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e16c3193-bfb0-4095-98ff-c8bacc109e97',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(da4dd4c0-c5df-4f9e-a201-2924271c879f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 19:55:59 compute-0 podman[192499]: time="2026-01-26T19:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:55:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.762 183181 DEBUG nova.objects.instance [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid e16c3193-bfb0-4095-98ff-c8bacc109e97 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.764 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 19:55:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.767 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:55:59 compute-0 nova_compute[183177]: 2026-01-26 19:55:59.767 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.270 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.271 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.290 183181 DEBUG nova.virt.libvirt.vif [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-239',id=18,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:55:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-aishmv0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:55:04Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=e16c3193-bfb0-4095-98ff-c8bacc109e97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.291 183181 DEBUG nova.network.os_vif_util [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.291 183181 DEBUG nova.network.os_vif_util [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.292 183181 DEBUG nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:5e:ed:9e"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <target dev="tape1e6b646-6f"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]: </interface>
Jan 26 19:56:00 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.293 183181 DEBUG nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <name>instance-00000012</name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <uuid>e16c3193-bfb0-4095-98ff-c8bacc109e97</uuid>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482</nova:name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:54:58</nova:creationTime>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:port uuid="e1e6b646-6f32-4142-8c54-471be13d3049">
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="serial">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="uuid">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:5e:ed:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape1e6b646-6f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </target>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </console>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </input>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]: </domain>
Jan 26 19:56:00 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.294 183181 DEBUG nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <name>instance-00000012</name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <uuid>e16c3193-bfb0-4095-98ff-c8bacc109e97</uuid>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482</nova:name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:54:58</nova:creationTime>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:port uuid="e1e6b646-6f32-4142-8c54-471be13d3049">
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="serial">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="uuid">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:5e:ed:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape1e6b646-6f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </target>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </console>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </input>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]: </domain>
Jan 26 19:56:00 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.295 183181 DEBUG nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <name>instance-00000012</name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <uuid>e16c3193-bfb0-4095-98ff-c8bacc109e97</uuid>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482</nova:name>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:54:58</nova:creationTime>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:user uuid="0415606853f441d3b598e0af51d1b700">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin</nova:user>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:project uuid="00d6147467834874bb42a420f895fa88">tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351</nova:project>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <nova:port uuid="e1e6b646-6f32-4142-8c54-471be13d3049">
Jan 26 19:56:00 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </resource>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="serial">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="uuid">e16c3193-bfb0-4095-98ff-c8bacc109e97</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </system>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </os>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </features>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/disk.config"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </controller>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:5e:ed:9e"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target dev="tape1e6b646-6f"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 19:56:00 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       </target>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97/console.log" append="off"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </console>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </input>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </graphics>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </video>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:56:00 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:56:00 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 19:56:00 compute-0 nova_compute[183177]: </domain>
Jan 26 19:56:00 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.296 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.773 183181 DEBUG nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.774 183181 INFO nova.virt.libvirt.migration [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 19:56:00 compute-0 nova_compute[183177]: 2026-01-26 19:56:00.837 183181 WARNING neutronclient.v2_0.client [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.051 183181 DEBUG nova.network.neutron [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updated VIF entry in instance network info cache for port e1e6b646-6f32-4142-8c54-471be13d3049. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.052 183181 DEBUG nova.network.neutron [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updating instance_info_cache with network_info: [{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:01 compute-0 openstack_network_exporter[195363]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:56:01 compute-0 openstack_network_exporter[195363]: ERROR   19:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.562 183181 DEBUG oslo_concurrency.lockutils [req-34837b6e-a8a5-4677-9941-d23fc637c15a req-1db5b993-b50a-4ca7-a8f6-6626953079b0 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e16c3193-bfb0-4095-98ff-c8bacc109e97" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.667 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:56:01 compute-0 nova_compute[183177]: 2026-01-26 19:56:01.792 183181 INFO nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 19:56:02 compute-0 kernel: tape1e6b646-6f (unregistering): left promiscuous mode
Jan 26 19:56:02 compute-0 NetworkManager[55489]: <info>  [1769457362.2019] device (tape1e6b646-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:56:02 compute-0 ovn_controller[95396]: 2026-01-26T19:56:02Z|00142|binding|INFO|Releasing lport e1e6b646-6f32-4142-8c54-471be13d3049 from this chassis (sb_readonly=0)
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.213 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 ovn_controller[95396]: 2026-01-26T19:56:02Z|00143|binding|INFO|Setting lport e1e6b646-6f32-4142-8c54-471be13d3049 down in Southbound
Jan 26 19:56:02 compute-0 ovn_controller[95396]: 2026-01-26T19:56:02Z|00144|binding|INFO|Removing iface tape1e6b646-6f ovn-installed in OVS
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.218 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.223 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:ed:9e 10.100.0.13'], port_security=['fa:16:3e:5e:ed:9e 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e16c3193-bfb0-4095-98ff-c8bacc109e97', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '00d6147467834874bb42a420f895fa88', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3f24c591-c667-4b44-9fc4-3f4f62949186', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87196033-f000-4959-a8ea-24ef132800a5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=e1e6b646-6f32-4142-8c54-471be13d3049) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.227 104672 INFO neutron.agent.ovn.metadata.agent [-] Port e1e6b646-6f32-4142-8c54-471be13d3049 in datapath dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 unbound from our chassis
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.229 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.232 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[777355f5-fc4c-401e-9768-bf6ef5fbaf24]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.233 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 namespace which is not needed anymore
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.246 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 26 19:56:02 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 16.851s CPU time.
Jan 26 19:56:02 compute-0 systemd-machined[154465]: Machine qemu-12-instance-00000012 terminated.
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.434 183181 DEBUG nova.compute.manager [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.437 183181 DEBUG oslo_concurrency.lockutils [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.437 183181 DEBUG oslo_concurrency.lockutils [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.437 183181 DEBUG oslo_concurrency.lockutils [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.438 183181 DEBUG nova.compute.manager [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.438 183181 DEBUG nova.compute.manager [req-79b5e9bb-f5b9-49b8-bfc7-3df405e8f7f5 req-c6a51784-86f1-49f8-8fd3-d06b58da0d9e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:56:02 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [NOTICE]   (210862) : haproxy version is 3.0.5-8e879a5
Jan 26 19:56:02 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [NOTICE]   (210862) : path to executable is /usr/sbin/haproxy
Jan 26 19:56:02 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [WARNING]  (210862) : Exiting Master process...
Jan 26 19:56:02 compute-0 podman[211130]: 2026-01-26 19:56:02.457015707 +0000 UTC m=+0.064160353 container kill 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 26 19:56:02 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [ALERT]    (210862) : Current worker (210864) exited with code 143 (Terminated)
Jan 26 19:56:02 compute-0 neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4[210858]: [WARNING]  (210862) : All workers exited. Exiting... (0)
Jan 26 19:56:02 compute-0 systemd[1]: libpod-934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47.scope: Deactivated successfully.
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.461 183181 DEBUG nova.virt.libvirt.guest [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.462 183181 INFO nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration operation has completed
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.464 183181 INFO nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] _post_live_migration() is started..
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.467 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.468 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.469 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.493 183181 WARNING neutronclient.v2_0.client [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.495 183181 WARNING neutronclient.v2_0.client [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:56:02 compute-0 podman[211162]: 2026-01-26 19:56:02.507632623 +0000 UTC m=+0.033035362 container died 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 19:56:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47-userdata-shm.mount: Deactivated successfully.
Jan 26 19:56:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-6fd037b78690181f1d0f5de2a5f6714364414a4217ce60f24fd8a5dec8f3d027-merged.mount: Deactivated successfully.
Jan 26 19:56:02 compute-0 podman[211162]: 2026-01-26 19:56:02.56199285 +0000 UTC m=+0.087395559 container cleanup 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, io.buildah.version=1.41.4)
Jan 26 19:56:02 compute-0 systemd[1]: libpod-conmon-934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47.scope: Deactivated successfully.
Jan 26 19:56:02 compute-0 podman[211170]: 2026-01-26 19:56:02.5849633 +0000 UTC m=+0.081645175 container remove 934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.594 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[29f7289f-b719-48bb-9664-f3aed5bed555]: (4, ("Mon Jan 26 07:56:02 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 (934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47)\n934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47\nMon Jan 26 07:56:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 (934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47)\n934d4b1db10e6536a4600e14c8bba9f5504b0dc0638c4b2b98e8261909f44e47\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.596 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1d9268-37a6-4d28-a26d-54a0278f85c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.597 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.598 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa6df6e-34e5-4bf9-bfd0-a00186a40129]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.599 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcbad604-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.602 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.626 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 kernel: tapdcbad604-d0: left promiscuous mode
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.635 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.640 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4383f800-70a8-4e0f-bdcc-90891f505d45]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.653 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a531e379-7729-4681-b9c1-7d00ec4c1c97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.654 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c19edd45-b7a6-490c-a7f7-f404e328e090]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.684 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f773be21-40ca-4b6e-8617-24444381ad0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 476114, 'reachable_time': 27279, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211200, 'error': None, 'target': 'ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.688 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:56:02 compute-0 systemd[1]: run-netns-ovnmeta\x2ddcbad604\x2ddc9d\x2d41d5\x2da1f5\x2d0fd2f7c5a4b4.mount: Deactivated successfully.
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.688 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bfdff5-f3a2-4a9b-918d-e953f4495996]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.733 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Error from libvirt while getting description of instance-00000012: [Error Code 42] Domain not found: no domain with matching uuid 'e16c3193-bfb0-4095-98ff-c8bacc109e97' (instance-00000012): libvirt.libvirtError: Domain not found: no domain with matching uuid 'e16c3193-bfb0-4095-98ff-c8bacc109e97' (instance-00000012)
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.749 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.749 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:02.750 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.917 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.918 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.951 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.952 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5549MB free_disk=73.06978988647461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.953 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.954 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.982 183181 DEBUG nova.compute.manager [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.983 183181 DEBUG oslo_concurrency.lockutils [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.983 183181 DEBUG oslo_concurrency.lockutils [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.984 183181 DEBUG oslo_concurrency.lockutils [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.984 183181 DEBUG nova.compute.manager [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:02 compute-0 nova_compute[183177]: 2026-01-26 19:56:02.985 183181 DEBUG nova.compute.manager [req-d46b1a68-3a13-475e-a840-74630ff57714 req-d7ed068b-e6e5-49b0-9cb4-d6cf5bba5dc2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.056 183181 DEBUG nova.network.neutron [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port e1e6b646-6f32-4142-8c54-471be13d3049 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.057 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.058 183181 DEBUG nova.virt.libvirt.vif [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-239345482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-239',id=18,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:55:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='00d6147467834874bb42a420f895fa88',ramdisk_id='',reservation_id='r-aishmv0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1782412351-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:55:39Z,user_data=None,user_id='0415606853f441d3b598e0af51d1b700',uuid=e16c3193-bfb0-4095-98ff-c8bacc109e97,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.059 183181 DEBUG nova.network.os_vif_util [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "e1e6b646-6f32-4142-8c54-471be13d3049", "address": "fa:16:3e:5e:ed:9e", "network": {"id": "dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1592311190-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b15450fb68e4a298e85f1a6a3da0e80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e6b646-6f", "ovs_interfaceid": "e1e6b646-6f32-4142-8c54-471be13d3049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.060 183181 DEBUG nova.network.os_vif_util [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.061 183181 DEBUG os_vif [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.065 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.066 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1e6b646-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.068 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.070 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.072 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9fe12c16-2428-41ef-9486-1a8c31a8229d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.073 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.074 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.078 183181 INFO os_vif [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:ed:9e,bridge_name='br-int',has_traffic_filtering=True,id=e1e6b646-6f32-4142-8c54-471be13d3049,network=Network(dcbad604-dc9d-41d5-a1f5-0fd2f7c5a4b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e6b646-6f')
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.079 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:03 compute-0 nova_compute[183177]: 2026-01-26 19:56:03.986 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Updating resource usage from migration da4dd4c0-c5df-4f9e-a201-2924271c879f
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.028 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration da4dd4c0-c5df-4f9e-a201-2924271c879f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.029 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.029 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:56:02 up  1:20,  0 user,  load average: 0.33, 0.27, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_00d6147467834874bb42a420f895fa88': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.072 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.531 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.531 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.532 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.532 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.532 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.533 183181 WARNING nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received unexpected event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with vm_state active and task_state migrating.
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.533 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.533 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.534 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.534 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.534 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.535 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-unplugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.535 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.535 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.536 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.536 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.536 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.537 183181 WARNING nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received unexpected event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with vm_state active and task_state migrating.
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.537 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.537 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.538 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.538 183181 DEBUG oslo_concurrency.lockutils [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.539 183181 DEBUG nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] No waiting events found dispatching network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.539 183181 WARNING nova.compute.manager [req-6c3ea4c4-90b7-44e3-af6f-04e83b68f511 req-57ac2fc1-2793-418e-85d5-b7ca47769837 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Received unexpected event network-vif-plugged-e1e6b646-6f32-4142-8c54-471be13d3049 for instance with vm_state active and task_state migrating.
Jan 26 19:56:04 compute-0 nova_compute[183177]: 2026-01-26 19:56:04.579 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.098 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.098 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.099 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 2.020s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.099 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.099 183181 DEBUG nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.100 183181 INFO nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Deleting instance files /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97_del
Jan 26 19:56:05 compute-0 nova_compute[183177]: 2026-01-26 19:56:05.101 183181 INFO nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Deletion of /var/lib/nova/instances/e16c3193-bfb0-4095-98ff-c8bacc109e97_del complete
Jan 26 19:56:06 compute-0 nova_compute[183177]: 2026-01-26 19:56:06.101 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:06 compute-0 nova_compute[183177]: 2026-01-26 19:56:06.102 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:06 compute-0 nova_compute[183177]: 2026-01-26 19:56:06.102 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:56:06 compute-0 nova_compute[183177]: 2026-01-26 19:56:06.107 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:06 compute-0 nova_compute[183177]: 2026-01-26 19:56:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:06.752 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:56:08 compute-0 nova_compute[183177]: 2026-01-26 19:56:08.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:08 compute-0 nova_compute[183177]: 2026-01-26 19:56:08.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:08 compute-0 nova_compute[183177]: 2026-01-26 19:56:08.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:11 compute-0 nova_compute[183177]: 2026-01-26 19:56:11.109 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:13 compute-0 nova_compute[183177]: 2026-01-26 19:56:13.107 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:14 compute-0 nova_compute[183177]: 2026-01-26 19:56:14.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:14 compute-0 nova_compute[183177]: 2026-01-26 19:56:14.639 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:14 compute-0 nova_compute[183177]: 2026-01-26 19:56:14.640 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:14 compute-0 nova_compute[183177]: 2026-01-26 19:56:14.640 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e16c3193-bfb0-4095-98ff-c8bacc109e97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.156 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.157 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.157 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.158 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.389 183181 WARNING nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.391 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.429 183181 DEBUG oslo_concurrency.processutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.431 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.09869384765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.431 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:15 compute-0 nova_compute[183177]: 2026-01-26 19:56:15.432 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:16 compute-0 nova_compute[183177]: 2026-01-26 19:56:16.112 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:16 compute-0 nova_compute[183177]: 2026-01-26 19:56:16.463 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance e16c3193-bfb0-4095-98ff-c8bacc109e97 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 19:56:16 compute-0 nova_compute[183177]: 2026-01-26 19:56:16.975 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 19:56:17 compute-0 nova_compute[183177]: 2026-01-26 19:56:17.020 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration da4dd4c0-c5df-4f9e-a201-2924271c879f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 19:56:17 compute-0 nova_compute[183177]: 2026-01-26 19:56:17.021 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:56:17 compute-0 nova_compute[183177]: 2026-01-26 19:56:17.021 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:56:15 up  1:20,  0 user,  load average: 0.27, 0.26, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:56:17 compute-0 nova_compute[183177]: 2026-01-26 19:56:17.071 183181 DEBUG nova.compute.provider_tree [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:56:17 compute-0 nova_compute[183177]: 2026-01-26 19:56:17.586 183181 DEBUG nova.scheduler.client.report [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:56:18 compute-0 nova_compute[183177]: 2026-01-26 19:56:18.107 183181 DEBUG nova.compute.resource_tracker [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:56:18 compute-0 nova_compute[183177]: 2026-01-26 19:56:18.108 183181 DEBUG oslo_concurrency.lockutils [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.676s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:18 compute-0 nova_compute[183177]: 2026-01-26 19:56:18.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:18 compute-0 nova_compute[183177]: 2026-01-26 19:56:18.136 183181 INFO nova.compute.manager [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 19:56:19 compute-0 nova_compute[183177]: 2026-01-26 19:56:19.237 183181 INFO nova.scheduler.client.report [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration da4dd4c0-c5df-4f9e-a201-2924271c879f
Jan 26 19:56:19 compute-0 nova_compute[183177]: 2026-01-26 19:56:19.238 183181 DEBUG nova.virt.libvirt.driver [None req-79acfa03-06ce-4c63-9ac6-c9f4d9ac02ae 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e16c3193-bfb0-4095-98ff-c8bacc109e97] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 19:56:19 compute-0 podman[211206]: 2026-01-26 19:56:19.41027051 +0000 UTC m=+0.142661322 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Jan 26 19:56:21 compute-0 nova_compute[183177]: 2026-01-26 19:56:21.115 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:21 compute-0 podman[211234]: 2026-01-26 19:56:21.357831919 +0000 UTC m=+0.096814434 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:56:21 compute-0 podman[211235]: 2026-01-26 19:56:21.3656583 +0000 UTC m=+0.086547637 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 19:56:23 compute-0 nova_compute[183177]: 2026-01-26 19:56:23.117 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:23 compute-0 sshd-session[211277]: Connection closed by authenticating user root 142.93.140.142 port 40648 [preauth]
Jan 26 19:56:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:24.072 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:56:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:24.072 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:56:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:24.072 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:56:26 compute-0 nova_compute[183177]: 2026-01-26 19:56:26.116 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:27 compute-0 podman[211280]: 2026-01-26 19:56:27.381679375 +0000 UTC m=+0.116811615 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 19:56:28 compute-0 nova_compute[183177]: 2026-01-26 19:56:28.120 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:30 compute-0 podman[192499]: time="2026-01-26T19:56:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:56:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:56:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:56:30 compute-0 podman[192499]: @ - - [26/Jan/2026:19:56:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 26 19:56:31 compute-0 nova_compute[183177]: 2026-01-26 19:56:31.118 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:31 compute-0 openstack_network_exporter[195363]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:56:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:56:31 compute-0 openstack_network_exporter[195363]: ERROR   19:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:56:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:56:33 compute-0 nova_compute[183177]: 2026-01-26 19:56:33.123 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:36 compute-0 sshd-session[211305]: Invalid user ansible from 193.32.162.151 port 60436
Jan 26 19:56:36 compute-0 nova_compute[183177]: 2026-01-26 19:56:36.121 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:36 compute-0 sshd-session[211305]: Connection closed by invalid user ansible 193.32.162.151 port 60436 [preauth]
Jan 26 19:56:38 compute-0 nova_compute[183177]: 2026-01-26 19:56:38.125 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:39 compute-0 nova_compute[183177]: 2026-01-26 19:56:39.867 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:41 compute-0 nova_compute[183177]: 2026-01-26 19:56:41.122 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:43 compute-0 nova_compute[183177]: 2026-01-26 19:56:43.127 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:46 compute-0 nova_compute[183177]: 2026-01-26 19:56:46.127 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:48 compute-0 nova_compute[183177]: 2026-01-26 19:56:48.129 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:50 compute-0 podman[211307]: 2026-01-26 19:56:50.441710711 +0000 UTC m=+0.181124450 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:56:51 compute-0 nova_compute[183177]: 2026-01-26 19:56:51.130 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:52.135 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:a9:5d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eae384f11074574984dfe78117085bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4429b655-2f88-4c91-b9bf-1d7728eb783d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a64be302-a856-412d-b0cb-ff5bd9fb5cec) old=Port_Binding(mac=['fa:16:3e:18:a9:5d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eae384f11074574984dfe78117085bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:56:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:52.137 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a64be302-a856-412d-b0cb-ff5bd9fb5cec in datapath 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 updated
Jan 26 19:56:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:52.138 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:56:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:52.140 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdcfd75-147c-4e88-a113-f17c7b0dbc8c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:52 compute-0 podman[211335]: 2026-01-26 19:56:52.350405201 +0000 UTC m=+0.092142708 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Jan 26 19:56:52 compute-0 podman[211336]: 2026-01-26 19:56:52.383387271 +0000 UTC m=+0.109802134 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 19:56:53 compute-0 nova_compute[183177]: 2026-01-26 19:56:53.131 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:55 compute-0 nova_compute[183177]: 2026-01-26 19:56:55.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:56 compute-0 nova_compute[183177]: 2026-01-26 19:56:56.132 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:56 compute-0 nova_compute[183177]: 2026-01-26 19:56:56.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:56:58 compute-0 nova_compute[183177]: 2026-01-26 19:56:58.134 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:56:58 compute-0 podman[211379]: 2026-01-26 19:56:58.341603535 +0000 UTC m=+0.074782359 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:56:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:59.058 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:52:76 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6e2b29ab-cda4-47e6-8d33-81c9aa559160', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e2b29ab-cda4-47e6-8d33-81c9aa559160', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92cb772a-9311-48e5-bc40-f1f76437fdfb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=41908b8d-4531-4c1b-826d-0ebd035a12b2) old=Port_Binding(mac=['fa:16:3e:6b:52:76'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-6e2b29ab-cda4-47e6-8d33-81c9aa559160', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e2b29ab-cda4-47e6-8d33-81c9aa559160', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:56:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:59.059 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 41908b8d-4531-4c1b-826d-0ebd035a12b2 in datapath 6e2b29ab-cda4-47e6-8d33-81c9aa559160 updated
Jan 26 19:56:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:59.060 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e2b29ab-cda4-47e6-8d33-81c9aa559160, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:56:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:56:59.061 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe74e67-a47b-406e-a59e-e957f9ffdb67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:56:59 compute-0 podman[192499]: time="2026-01-26T19:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:56:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:56:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.134 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:01 compute-0 openstack_network_exporter[195363]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:57:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:57:01 compute-0 openstack_network_exporter[195363]: ERROR   19:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:57:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.673 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.923 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.925 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.964 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.965 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.09867477416992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.966 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:01 compute-0 nova_compute[183177]: 2026-01-26 19:57:01.966 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.022 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.022 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:57:01 up  1:21,  0 user,  load average: 0.13, 0.22, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.043 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.075 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.076 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.089 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.110 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.177 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.185 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:57:03 compute-0 nova_compute[183177]: 2026-01-26 19:57:03.693 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:57:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:04.196 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:57:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:04.196 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:57:04 compute-0 nova_compute[183177]: 2026-01-26 19:57:04.211 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:57:04 compute-0 nova_compute[183177]: 2026-01-26 19:57:04.212 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.246s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:04 compute-0 nova_compute[183177]: 2026-01-26 19:57:04.244 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:04 compute-0 sshd-session[211404]: Connection closed by authenticating user root 142.93.140.142 port 59140 [preauth]
Jan 26 19:57:05 compute-0 nova_compute[183177]: 2026-01-26 19:57:05.209 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:05 compute-0 nova_compute[183177]: 2026-01-26 19:57:05.209 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:05 compute-0 nova_compute[183177]: 2026-01-26 19:57:05.210 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:57:06 compute-0 nova_compute[183177]: 2026-01-26 19:57:06.138 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:06 compute-0 nova_compute[183177]: 2026-01-26 19:57:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:08 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:08.198 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:08 compute-0 nova_compute[183177]: 2026-01-26 19:57:08.205 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:08 compute-0 nova_compute[183177]: 2026-01-26 19:57:08.553 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:08 compute-0 nova_compute[183177]: 2026-01-26 19:57:08.553 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.060 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.630 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.631 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.645 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:57:09 compute-0 nova_compute[183177]: 2026-01-26 19:57:09.645 183181 INFO nova.compute.claims [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:57:10 compute-0 nova_compute[183177]: 2026-01-26 19:57:10.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:10 compute-0 nova_compute[183177]: 2026-01-26 19:57:10.758 183181 DEBUG nova.compute.provider_tree [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:57:11 compute-0 nova_compute[183177]: 2026-01-26 19:57:11.140 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:11 compute-0 nova_compute[183177]: 2026-01-26 19:57:11.271 183181 DEBUG nova.scheduler.client.report [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:57:11 compute-0 nova_compute[183177]: 2026-01-26 19:57:11.784 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.153s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:11 compute-0 nova_compute[183177]: 2026-01-26 19:57:11.785 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:57:12 compute-0 nova_compute[183177]: 2026-01-26 19:57:12.302 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:57:12 compute-0 nova_compute[183177]: 2026-01-26 19:57:12.303 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:57:12 compute-0 nova_compute[183177]: 2026-01-26 19:57:12.304 183181 WARNING neutronclient.v2_0.client [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:57:12 compute-0 nova_compute[183177]: 2026-01-26 19:57:12.305 183181 WARNING neutronclient.v2_0.client [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:57:12 compute-0 nova_compute[183177]: 2026-01-26 19:57:12.817 183181 INFO nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:57:13 compute-0 nova_compute[183177]: 2026-01-26 19:57:13.001 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Successfully created port: 2da42ab1-63fa-490f-96ad-85fd462fa1a4 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:57:13 compute-0 nova_compute[183177]: 2026-01-26 19:57:13.243 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:13 compute-0 nova_compute[183177]: 2026-01-26 19:57:13.334 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.363 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.366 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.366 183181 INFO nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Creating image(s)
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.367 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.368 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.369 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.370 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.377 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.380 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.482 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.483 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.484 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.485 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.492 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.493 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.567 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.569 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.626 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.628 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.629 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.698 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.700 183181 DEBUG nova.virt.disk.api [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Checking if we can resize image /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.700 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.765 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.767 183181 DEBUG nova.virt.disk.api [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Cannot resize image /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.768 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.768 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Ensure instance console log exists: /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.769 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.769 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:14 compute-0 nova_compute[183177]: 2026-01-26 19:57:14.770 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.063 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Successfully updated port: 2da42ab1-63fa-490f-96ad-85fd462fa1a4 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:57:15 compute-0 ovn_controller[95396]: 2026-01-26T19:57:15Z|00145|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.175 183181 DEBUG nova.compute.manager [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-changed-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.176 183181 DEBUG nova.compute.manager [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Refreshing instance network info cache due to event network-changed-2da42ab1-63fa-490f-96ad-85fd462fa1a4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.176 183181 DEBUG oslo_concurrency.lockutils [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.177 183181 DEBUG oslo_concurrency.lockutils [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.177 183181 DEBUG nova.network.neutron [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Refreshing network info cache for port 2da42ab1-63fa-490f-96ad-85fd462fa1a4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.571 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.687 183181 WARNING neutronclient.v2_0.client [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:57:15 compute-0 nova_compute[183177]: 2026-01-26 19:57:15.878 183181 DEBUG nova.network.neutron [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:57:16 compute-0 nova_compute[183177]: 2026-01-26 19:57:16.009 183181 DEBUG nova.network.neutron [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:57:16 compute-0 nova_compute[183177]: 2026-01-26 19:57:16.145 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:16 compute-0 nova_compute[183177]: 2026-01-26 19:57:16.518 183181 DEBUG oslo_concurrency.lockutils [req-5b49812c-d4f1-4868-a4b5-eef2146f5558 req-8d4cd396-ee78-410e-b082-fd77b22f4a5c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:57:16 compute-0 nova_compute[183177]: 2026-01-26 19:57:16.519 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquired lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:57:16 compute-0 nova_compute[183177]: 2026-01-26 19:57:16.519 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:57:17 compute-0 nova_compute[183177]: 2026-01-26 19:57:17.859 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:57:18 compute-0 nova_compute[183177]: 2026-01-26 19:57:18.287 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:18 compute-0 nova_compute[183177]: 2026-01-26 19:57:18.502 183181 WARNING neutronclient.v2_0.client [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:57:18 compute-0 nova_compute[183177]: 2026-01-26 19:57:18.737 183181 DEBUG nova.network.neutron [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Updating instance_info_cache with network_info: [{"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.245 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Releasing lock "refresh_cache-3538b478-193d-4710-b409-b238c7fee35a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.246 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance network_info: |[{"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.250 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Start _get_guest_xml network_info=[{"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.256 183181 WARNING nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.258 183181 DEBUG nova.virt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-922921175', uuid='3538b478-193d-4710-b409-b238c7fee35a'), owner=OwnerMeta(userid='9a35d70b2552448eb80c3a52422369a8', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin', projectid='d75b993944424869a47d42c106a38c67', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457439.2587183) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.263 183181 DEBUG nova.virt.libvirt.host [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.264 183181 DEBUG nova.virt.libvirt.host [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.268 183181 DEBUG nova.virt.libvirt.host [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.268 183181 DEBUG nova.virt.libvirt.host [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.270 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.271 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.271 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.272 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.272 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.273 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.273 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.273 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.274 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.274 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.275 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.275 183181 DEBUG nova.virt.hardware [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.282 183181 DEBUG nova.virt.libvirt.vif [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:57:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-922921175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-922921175',id=20,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d75b993944424869a47d42c106a38c67',ramdisk_id='',reservation_id='r-c0r4mjx8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:57:13Z,user_data=None,user_id='9a35d70b2552448eb80c3a52422369a8',uuid=3538b478-193d-4710-b409-b238c7fee35a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.283 183181 DEBUG nova.network.os_vif_util [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converting VIF {"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.284 183181 DEBUG nova.network.os_vif_util [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.286 183181 DEBUG nova.objects.instance [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3538b478-193d-4710-b409-b238c7fee35a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.808 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] End _get_guest_xml xml=<domain type="kvm">
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <uuid>3538b478-193d-4710-b409-b238c7fee35a</uuid>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <name>instance-00000014</name>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <metadata>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-922921175</nova:name>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 19:57:19</nova:creationTime>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 19:57:19 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 19:57:19 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:user uuid="9a35d70b2552448eb80c3a52422369a8">tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin</nova:user>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:project uuid="d75b993944424869a47d42c106a38c67">tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307</nova:project>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         <nova:port uuid="2da42ab1-63fa-490f-96ad-85fd462fa1a4">
Jan 26 19:57:19 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </metadata>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <system>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="serial">3538b478-193d-4710-b409-b238c7fee35a</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="uuid">3538b478-193d-4710-b409-b238c7fee35a</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </system>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <os>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </os>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <features>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <apic/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </features>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </clock>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </cpu>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   <devices>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.config"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </disk>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:db:28:eb"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <target dev="tap2da42ab1-63"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </interface>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/console.log" append="off"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </serial>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <video>
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </video>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </rng>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 19:57:19 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 19:57:19 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 19:57:19 compute-0 nova_compute[183177]:   </devices>
Jan 26 19:57:19 compute-0 nova_compute[183177]: </domain>
Jan 26 19:57:19 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.810 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Preparing to wait for external event network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.811 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.811 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.812 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.813 183181 DEBUG nova.virt.libvirt.vif [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:57:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-922921175',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-922921175',id=20,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d75b993944424869a47d42c106a38c67',ramdisk_id='',reservation_id='r-c0r4mjx8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:57:13Z,user_data=None,user_id='9a35d70b2552448eb80c3a52422369a8',uuid=3538b478-193d-4710-b409-b238c7fee35a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.814 183181 DEBUG nova.network.os_vif_util [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converting VIF {"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.815 183181 DEBUG nova.network.os_vif_util [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.815 183181 DEBUG os_vif [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.817 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.817 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.818 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.819 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.820 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b442009b-1329-5ae9-8fef-61ec0438b941', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.822 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.824 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.828 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.828 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2da42ab1-63, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.829 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2da42ab1-63, col_values=(('qos', UUID('51a0ac28-dded-4b90-a6f6-36e42167526b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.829 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2da42ab1-63, col_values=(('external_ids', {'iface-id': '2da42ab1-63fa-490f-96ad-85fd462fa1a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:28:eb', 'vm-uuid': '3538b478-193d-4710-b409-b238c7fee35a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.831 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 NetworkManager[55489]: <info>  [1769457439.8327] manager: (tap2da42ab1-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.834 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.841 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:19 compute-0 nova_compute[183177]: 2026-01-26 19:57:19.842 183181 INFO os_vif [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63')
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.147 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:21 compute-0 podman[211424]: 2026-01-26 19:57:21.404382327 +0000 UTC m=+0.145092517 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.407 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.409 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.410 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] No VIF found with MAC fa:16:3e:db:28:eb, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.411 183181 INFO nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Using config drive
Jan 26 19:57:21 compute-0 nova_compute[183177]: 2026-01-26 19:57:21.925 183181 WARNING neutronclient.v2_0.client [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.423 183181 INFO nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Creating config drive at /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.config
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.433 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp677d3y7w execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.576 183181 DEBUG oslo_concurrency.processutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp677d3y7w" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:57:22 compute-0 kernel: tap2da42ab1-63: entered promiscuous mode
Jan 26 19:57:22 compute-0 NetworkManager[55489]: <info>  [1769457442.6653] manager: (tap2da42ab1-63): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.665 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:22 compute-0 ovn_controller[95396]: 2026-01-26T19:57:22Z|00146|binding|INFO|Claiming lport 2da42ab1-63fa-490f-96ad-85fd462fa1a4 for this chassis.
Jan 26 19:57:22 compute-0 ovn_controller[95396]: 2026-01-26T19:57:22Z|00147|binding|INFO|2da42ab1-63fa-490f-96ad-85fd462fa1a4: Claiming fa:16:3e:db:28:eb 10.100.0.10
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.683 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:28:eb 10.100.0.10'], port_security=['fa:16:3e:db:28:eb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3538b478-193d-4710-b409-b238c7fee35a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89a2c834-f9c0-4d4f-90b6-d26cac037c36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4429b655-2f88-4c91-b9bf-1d7728eb783d, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=2da42ab1-63fa-490f-96ad-85fd462fa1a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.685 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 2da42ab1-63fa-490f-96ad-85fd462fa1a4 in datapath 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 bound to our chassis
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.686 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bd32c7b-f3d3-4dc7-82df-0adf92573e26
Jan 26 19:57:22 compute-0 systemd-udevd[211488]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.700 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca6ee5f-e6d3-4b14-8418-2912f1f6c4ce]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.701 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4bd32c7b-f1 in ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.703 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4bd32c7b-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.703 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3aaf1883-b689-4b6c-99ac-300a3d7d43a0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.704 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[80324187-116b-4dac-b516-5f90e99b6c55]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 systemd-machined[154465]: New machine qemu-13-instance-00000014.
Jan 26 19:57:22 compute-0 NetworkManager[55489]: <info>  [1769457442.7147] device (tap2da42ab1-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:57:22 compute-0 NetworkManager[55489]: <info>  [1769457442.7154] device (tap2da42ab1-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.716 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[9528a9bb-cef7-4da3-b977-ecfcbadb44f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.724 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:22 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000014.
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.728 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:22 compute-0 ovn_controller[95396]: 2026-01-26T19:57:22Z|00148|binding|INFO|Setting lport 2da42ab1-63fa-490f-96ad-85fd462fa1a4 ovn-installed in OVS
Jan 26 19:57:22 compute-0 ovn_controller[95396]: 2026-01-26T19:57:22Z|00149|binding|INFO|Setting lport 2da42ab1-63fa-490f-96ad-85fd462fa1a4 up in Southbound
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.732 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.733 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b56f61b4-c1f0-4ce1-8539-9fd08c3d3cfe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 podman[211460]: 2026-01-26 19:57:22.750238004 +0000 UTC m=+0.101483440 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 26 19:57:22 compute-0 podman[211462]: 2026-01-26 19:57:22.775336282 +0000 UTC m=+0.106867156 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest)
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.777 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[44fa8fb2-4420-44a7-b566-87bc4d8a6014]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.781 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[15cabee8-9bf8-4b4a-b8de-506495eb8cf0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 NetworkManager[55489]: <info>  [1769457442.7828] manager: (tap4bd32c7b-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 26 19:57:22 compute-0 systemd-udevd[211499]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.818 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c49701b9-07b9-4582-83f3-d9d60a736e88]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.821 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[31f8c673-39c4-4856-94f1-49c62d297636]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 NetworkManager[55489]: <info>  [1769457442.8489] device (tap4bd32c7b-f0): carrier: link connected
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.858 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[bc791055-95e2-4876-ba7f-7a559e9f091c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.881 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b461e0-b4f5-4a00-9920-456dfa0377bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bd32c7b-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a9:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490173, 'reachable_time': 37567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211540, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.901 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[143cf3ea-0f00-44f9-b84b-5bcf903f567d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:a95d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490173, 'tstamp': 490173}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211541, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.922 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9da6e7eb-8ec7-4a83-b713-44ff30ab0201]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bd32c7b-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a9:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490173, 'reachable_time': 37567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211542, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:22.963 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1452dd86-5a87-4ad1-b8da-ecf7538c5fec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.994 183181 DEBUG nova.compute.manager [req-e32d9b5f-3651-44aa-9316-9622eef5d758 req-d1b575aa-b7d0-42c0-833e-598a591eff99 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.995 183181 DEBUG oslo_concurrency.lockutils [req-e32d9b5f-3651-44aa-9316-9622eef5d758 req-d1b575aa-b7d0-42c0-833e-598a591eff99 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.995 183181 DEBUG oslo_concurrency.lockutils [req-e32d9b5f-3651-44aa-9316-9622eef5d758 req-d1b575aa-b7d0-42c0-833e-598a591eff99 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.995 183181 DEBUG oslo_concurrency.lockutils [req-e32d9b5f-3651-44aa-9316-9622eef5d758 req-d1b575aa-b7d0-42c0-833e-598a591eff99 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:22 compute-0 nova_compute[183177]: 2026-01-26 19:57:22.996 183181 DEBUG nova.compute.manager [req-e32d9b5f-3651-44aa-9316-9622eef5d758 req-d1b575aa-b7d0-42c0-833e-598a591eff99 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Processing event network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.061 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[40e56d26-6289-4cd7-8657-24f3c66e2e06]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.063 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bd32c7b-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.064 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.064 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bd32c7b-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:23 compute-0 NetworkManager[55489]: <info>  [1769457443.0998] manager: (tap4bd32c7b-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 26 19:57:23 compute-0 kernel: tap4bd32c7b-f0: entered promiscuous mode
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.098 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.103 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bd32c7b-f0, col_values=(('external_ids', {'iface-id': 'a64be302-a856-412d-b0cb-ff5bd9fb5cec'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.104 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:23 compute-0 ovn_controller[95396]: 2026-01-26T19:57:23Z|00150|binding|INFO|Releasing lport a64be302-a856-412d-b0cb-ff5bd9fb5cec from this chassis (sb_readonly=0)
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.105 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.107 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ca1736-3d29-4171-9af0-fdb66368bfa7]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.108 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.109 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.109 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.109 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.109 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc7af46-bf4c-4364-aa44-6ae0bf9cdbb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.110 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.110 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cb07ea9e-ed95-4cf4-b54b-cc9316f6dec5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.111 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: global
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-4bd32c7b-f3d3-4dc7-82df-0adf92573e26
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID 4bd32c7b-f3d3-4dc7-82df-0adf92573e26
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 19:57:23 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:23.112 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'env', 'PROCESS_TAG=haproxy-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.119 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.202 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.208 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.213 183181 INFO nova.virt.libvirt.driver [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance spawned successfully.
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.213 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 19:57:23 compute-0 podman[211581]: 2026-01-26 19:57:23.608884471 +0000 UTC m=+0.094595784 container create 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120)
Jan 26 19:57:23 compute-0 podman[211581]: 2026-01-26 19:57:23.564307678 +0000 UTC m=+0.050019001 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 19:57:23 compute-0 systemd[1]: Started libpod-conmon-86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c.scope.
Jan 26 19:57:23 compute-0 systemd[1]: Started libcrun container.
Jan 26 19:57:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7f5e34b5e1e0b47311717dfa263579a459c02fb6de46d8e559579bb3b7b5f32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.730 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.733 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.734 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.735 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.736 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 nova_compute[183177]: 2026-01-26 19:57:23.737 183181 DEBUG nova.virt.libvirt.driver [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 19:57:23 compute-0 podman[211581]: 2026-01-26 19:57:23.741560912 +0000 UTC m=+0.227272215 container init 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120)
Jan 26 19:57:23 compute-0 podman[211581]: 2026-01-26 19:57:23.752395005 +0000 UTC m=+0.238106288 container start 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Jan 26 19:57:23 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [NOTICE]   (211600) : New worker (211602) forked
Jan 26 19:57:23 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [NOTICE]   (211600) : Loading success.
Jan 26 19:57:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:24.074 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:24.075 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:57:24.075 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:24 compute-0 nova_compute[183177]: 2026-01-26 19:57:24.249 183181 INFO nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Took 9.89 seconds to spawn the instance on the hypervisor.
Jan 26 19:57:24 compute-0 nova_compute[183177]: 2026-01-26 19:57:24.250 183181 DEBUG nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:57:24 compute-0 nova_compute[183177]: 2026-01-26 19:57:24.788 183181 INFO nova.compute.manager [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Took 15.21 seconds to build instance.
Jan 26 19:57:24 compute-0 nova_compute[183177]: 2026-01-26 19:57:24.831 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.092 183181 DEBUG nova.compute.manager [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.094 183181 DEBUG oslo_concurrency.lockutils [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.094 183181 DEBUG oslo_concurrency.lockutils [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.094 183181 DEBUG oslo_concurrency.lockutils [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.095 183181 DEBUG nova.compute.manager [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] No waiting events found dispatching network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.095 183181 WARNING nova.compute.manager [req-ac1c0287-006e-4bc2-8bdc-2e4b52d70960 req-45b43689-d6f6-42f9-a900-e49af0432838 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received unexpected event network-vif-plugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 for instance with vm_state active and task_state None.
Jan 26 19:57:25 compute-0 nova_compute[183177]: 2026-01-26 19:57:25.294 183181 DEBUG oslo_concurrency.lockutils [None req-4d52edda-173d-4cca-baf1-99a30d247b06 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.741s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:57:26 compute-0 nova_compute[183177]: 2026-01-26 19:57:26.153 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:27 compute-0 sshd-session[211612]: Connection closed by authenticating user root 188.166.116.149 port 53072 [preauth]
Jan 26 19:57:29 compute-0 podman[211614]: 2026-01-26 19:57:29.358940156 +0000 UTC m=+0.094164832 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 19:57:29 compute-0 podman[192499]: time="2026-01-26T19:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:57:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:57:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Jan 26 19:57:29 compute-0 nova_compute[183177]: 2026-01-26 19:57:29.856 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:31 compute-0 nova_compute[183177]: 2026-01-26 19:57:31.326 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:31 compute-0 openstack_network_exporter[195363]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:57:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:57:31 compute-0 openstack_network_exporter[195363]: ERROR   19:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:57:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:57:34 compute-0 nova_compute[183177]: 2026-01-26 19:57:34.859 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:35 compute-0 ovn_controller[95396]: 2026-01-26T19:57:35Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:28:eb 10.100.0.10
Jan 26 19:57:35 compute-0 ovn_controller[95396]: 2026-01-26T19:57:35Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:28:eb 10.100.0.10
Jan 26 19:57:36 compute-0 nova_compute[183177]: 2026-01-26 19:57:36.359 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:39 compute-0 nova_compute[183177]: 2026-01-26 19:57:39.863 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:41 compute-0 nova_compute[183177]: 2026-01-26 19:57:41.374 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:42 compute-0 sshd-session[211661]: Connection closed by authenticating user root 142.93.140.142 port 59386 [preauth]
Jan 26 19:57:44 compute-0 nova_compute[183177]: 2026-01-26 19:57:44.867 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:46 compute-0 nova_compute[183177]: 2026-01-26 19:57:46.403 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:49 compute-0 nova_compute[183177]: 2026-01-26 19:57:49.871 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:51 compute-0 nova_compute[183177]: 2026-01-26 19:57:51.406 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:52 compute-0 nova_compute[183177]: 2026-01-26 19:57:52.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:52 compute-0 nova_compute[183177]: 2026-01-26 19:57:52.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 19:57:52 compute-0 podman[211664]: 2026-01-26 19:57:52.438764377 +0000 UTC m=+0.170202655 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4)
Jan 26 19:57:52 compute-0 nova_compute[183177]: 2026-01-26 19:57:52.663 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 19:57:53 compute-0 podman[211692]: 2026-01-26 19:57:53.317169987 +0000 UTC m=+0.061253744 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest)
Jan 26 19:57:53 compute-0 podman[211691]: 2026-01-26 19:57:53.330914928 +0000 UTC m=+0.086628659 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 19:57:54 compute-0 nova_compute[183177]: 2026-01-26 19:57:54.874 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:56 compute-0 nova_compute[183177]: 2026-01-26 19:57:56.410 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:57:57 compute-0 nova_compute[183177]: 2026-01-26 19:57:57.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:57 compute-0 nova_compute[183177]: 2026-01-26 19:57:57.664 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:57:59 compute-0 podman[192499]: time="2026-01-26T19:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:57:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:57:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Jan 26 19:57:59 compute-0 nova_compute[183177]: 2026-01-26 19:57:59.876 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:00 compute-0 podman[211730]: 2026-01-26 19:58:00.345076762 +0000 UTC m=+0.076616470 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 19:58:01 compute-0 openstack_network_exporter[195363]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:58:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:58:01 compute-0 nova_compute[183177]: 2026-01-26 19:58:01.455 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:01 compute-0 openstack_network_exporter[195363]: ERROR   19:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:58:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:58:03 compute-0 nova_compute[183177]: 2026-01-26 19:58:03.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:03 compute-0 nova_compute[183177]: 2026-01-26 19:58:03.675 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:03 compute-0 nova_compute[183177]: 2026-01-26 19:58:03.676 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:03 compute-0 nova_compute[183177]: 2026-01-26 19:58:03.676 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:03 compute-0 nova_compute[183177]: 2026-01-26 19:58:03.676 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:58:04 compute-0 nova_compute[183177]: 2026-01-26 19:58:04.748 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:04 compute-0 nova_compute[183177]: 2026-01-26 19:58:04.837 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:04 compute-0 nova_compute[183177]: 2026-01-26 19:58:04.838 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:04 compute-0 nova_compute[183177]: 2026-01-26 19:58:04.879 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:04 compute-0 nova_compute[183177]: 2026-01-26 19:58:04.898 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.095 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.097 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.131 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.132 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5554MB free_disk=73.07001495361328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.133 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:05 compute-0 nova_compute[183177]: 2026-01-26 19:58:05.133 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.190 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 3538b478-193d-4710-b409-b238c7fee35a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.191 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.191 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:58:05 up  1:22,  0 user,  load average: 0.26, 0.24, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_d75b993944424869a47d42c106a38c67': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.237 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.496 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:06 compute-0 nova_compute[183177]: 2026-01-26 19:58:06.744 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:58:07 compute-0 nova_compute[183177]: 2026-01-26 19:58:07.255 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:58:07 compute-0 nova_compute[183177]: 2026-01-26 19:58:07.255 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:07 compute-0 nova_compute[183177]: 2026-01-26 19:58:07.256 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:07 compute-0 nova_compute[183177]: 2026-01-26 19:58:07.256 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 19:58:08 compute-0 nova_compute[183177]: 2026-01-26 19:58:08.758 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:08 compute-0 nova_compute[183177]: 2026-01-26 19:58:08.758 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:08 compute-0 nova_compute[183177]: 2026-01-26 19:58:08.759 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:08 compute-0 nova_compute[183177]: 2026-01-26 19:58:08.759 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:58:09 compute-0 nova_compute[183177]: 2026-01-26 19:58:09.882 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:10 compute-0 nova_compute[183177]: 2026-01-26 19:58:10.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:10 compute-0 sshd-session[211762]: Connection closed by authenticating user root 188.166.116.149 port 45612 [preauth]
Jan 26 19:58:11 compute-0 nova_compute[183177]: 2026-01-26 19:58:11.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:11 compute-0 nova_compute[183177]: 2026-01-26 19:58:11.497 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:12 compute-0 sshd-session[211764]: Connection closed by authenticating user root 217.71.201.142 port 42960 [preauth]
Jan 26 19:58:13 compute-0 sshd-session[211766]: Connection closed by authenticating user root 217.71.201.142 port 42972 [preauth]
Jan 26 19:58:14 compute-0 ovn_controller[95396]: 2026-01-26T19:58:14Z|00151|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 19:58:14 compute-0 sshd-session[211768]: Connection closed by authenticating user root 217.71.201.142 port 42976 [preauth]
Jan 26 19:58:14 compute-0 sshd-session[211770]: Connection closed by authenticating user root 217.71.201.142 port 42982 [preauth]
Jan 26 19:58:14 compute-0 nova_compute[183177]: 2026-01-26 19:58:14.885 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:15 compute-0 sshd-session[211772]: Connection closed by authenticating user root 217.71.201.142 port 42988 [preauth]
Jan 26 19:58:16 compute-0 nova_compute[183177]: 2026-01-26 19:58:16.500 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:17 compute-0 sshd-session[211774]: Connection closed by authenticating user root 217.71.201.142 port 42992 [preauth]
Jan 26 19:58:17 compute-0 nova_compute[183177]: 2026-01-26 19:58:17.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:17 compute-0 sshd-session[211776]: Connection closed by authenticating user root 217.71.201.142 port 42996 [preauth]
Jan 26 19:58:18 compute-0 sshd-session[211778]: Connection closed by authenticating user root 217.71.201.142 port 43000 [preauth]
Jan 26 19:58:18 compute-0 sshd-session[211780]: Connection closed by authenticating user root 217.71.201.142 port 43008 [preauth]
Jan 26 19:58:19 compute-0 sshd-session[211782]: Connection closed by authenticating user root 217.71.201.142 port 43010 [preauth]
Jan 26 19:58:19 compute-0 nova_compute[183177]: 2026-01-26 19:58:19.889 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:20 compute-0 sshd-session[211784]: Connection closed by authenticating user root 217.71.201.142 port 43014 [preauth]
Jan 26 19:58:20 compute-0 nova_compute[183177]: 2026-01-26 19:58:20.585 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Creating tmpfile /var/lib/nova/instances/tmp4dp7jytc to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Jan 26 19:58:20 compute-0 nova_compute[183177]: 2026-01-26 19:58:20.588 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:20 compute-0 nova_compute[183177]: 2026-01-26 19:58:20.695 183181 DEBUG nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4dp7jytc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Jan 26 19:58:21 compute-0 nova_compute[183177]: 2026-01-26 19:58:21.502 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:21 compute-0 sshd-session[211786]: Connection closed by authenticating user root 142.93.140.142 port 49220 [preauth]
Jan 26 19:58:21 compute-0 sshd-session[211788]: Connection closed by authenticating user root 217.71.201.142 port 43020 [preauth]
Jan 26 19:58:22 compute-0 nova_compute[183177]: 2026-01-26 19:58:22.749 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:23 compute-0 podman[211792]: 2026-01-26 19:58:23.413228075 +0000 UTC m=+0.150908844 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_controller)
Jan 26 19:58:23 compute-0 podman[211820]: 2026-01-26 19:58:23.53825797 +0000 UTC m=+0.066809445 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 19:58:23 compute-0 sshd-session[211790]: Connection closed by authenticating user root 217.71.201.142 port 43032 [preauth]
Jan 26 19:58:23 compute-0 podman[211819]: 2026-01-26 19:58:23.564411926 +0000 UTC m=+0.109389434 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Jan 26 19:58:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:24.077 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:24.077 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:24.078 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:24 compute-0 nova_compute[183177]: 2026-01-26 19:58:24.927 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:25 compute-0 sshd-session[211861]: Connection closed by authenticating user root 217.71.201.142 port 43038 [preauth]
Jan 26 19:58:26 compute-0 nova_compute[183177]: 2026-01-26 19:58:26.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:26 compute-0 nova_compute[183177]: 2026-01-26 19:58:26.505 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:26 compute-0 nova_compute[183177]: 2026-01-26 19:58:26.753 183181 DEBUG nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4dp7jytc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='627aba4c-49d0-4c24-ba4b-ef4bd4843932',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Jan 26 19:58:26 compute-0 sshd-session[211863]: Connection closed by authenticating user root 217.71.201.142 port 43042 [preauth]
Jan 26 19:58:27 compute-0 sshd-session[211865]: Connection closed by authenticating user root 217.71.201.142 port 43056 [preauth]
Jan 26 19:58:27 compute-0 nova_compute[183177]: 2026-01-26 19:58:27.770 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:58:27 compute-0 nova_compute[183177]: 2026-01-26 19:58:27.770 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:58:27 compute-0 nova_compute[183177]: 2026-01-26 19:58:27.771 183181 DEBUG nova.network.neutron [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:58:28 compute-0 nova_compute[183177]: 2026-01-26 19:58:28.283 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:29 compute-0 sshd-session[211867]: Connection closed by authenticating user root 217.71.201.142 port 43060 [preauth]
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.295 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.462 183181 DEBUG nova.network.neutron [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Updating instance_info_cache with network_info: [{"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:58:29 compute-0 podman[192499]: time="2026-01-26T19:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:58:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:58:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2646 "" "Go-http-client/1.1"
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.972 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.975 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.992 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4dp7jytc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='627aba4c-49d0-4c24-ba4b-ef4bd4843932',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.993 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Creating instance directory: /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.993 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Creating disk.info with the contents: {'/var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk': 'qcow2', '/var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.994 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Jan 26 19:58:29 compute-0 nova_compute[183177]: 2026-01-26 19:58:29.995 183181 DEBUG nova.objects.instance [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 627aba4c-49d0-4c24-ba4b-ef4bd4843932 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.502 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.505 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.508 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.588 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.589 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.590 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.590 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.593 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.593 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.655 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.656 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.688 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.689 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.689 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.780 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.781 183181 DEBUG nova.virt.disk.api [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Checking if we can resize image /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.782 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.846 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.848 183181 DEBUG nova.virt.disk.api [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Cannot resize image /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:58:30 compute-0 nova_compute[183177]: 2026-01-26 19:58:30.848 183181 DEBUG nova.objects.instance [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 627aba4c-49d0-4c24-ba4b-ef4bd4843932 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:58:31 compute-0 podman[211884]: 2026-01-26 19:58:31.350723314 +0000 UTC m=+0.093970247 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.357 183181 DEBUG nova.objects.base [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Object Instance<627aba4c-49d0-4c24-ba4b-ef4bd4843932> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.358 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.401 183181 DEBUG oslo_concurrency.processutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932/disk.config 497664" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.402 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.404 183181 DEBUG nova.virt.libvirt.vif [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-26T19:57:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1069258687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1069258687',id=21,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:57:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d75b993944424869a47d42c106a38c67',ramdisk_id='',reservation_id='r-ys7zwnp5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:57:46Z,user_data=None,user_id='9a35d70b2552448eb80c3a52422369a8',uuid=627aba4c-49d0-4c24-ba4b-ef4bd4843932,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.404 183181 DEBUG nova.network.os_vif_util [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.405 183181 DEBUG nova.network.os_vif_util [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.406 183181 DEBUG os_vif [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.407 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.407 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.408 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.409 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.409 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4b2c4fc6-6af9-5255-9b67-b1804c649be9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.411 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.413 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 openstack_network_exporter[195363]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.418 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 openstack_network_exporter[195363]: ERROR   19:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.419 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c52c835-1a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.420 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7c52c835-1a, col_values=(('qos', UUID('a00d0e22-6908-43bd-be81-938a5d05085d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.421 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7c52c835-1a, col_values=(('external_ids', {'iface-id': '7c52c835-1a73-4fe5-bfd5-4754a6b58543', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:ff:c7', 'vm-uuid': '627aba4c-49d0-4c24-ba4b-ef4bd4843932'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.423 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 NetworkManager[55489]: <info>  [1769457511.4243] manager: (tap7c52c835-1a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.427 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.437 183181 INFO os_vif [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a')
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.438 183181 DEBUG nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.439 183181 DEBUG nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4dp7jytc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='627aba4c-49d0-4c24-ba4b-ef4bd4843932',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.440 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.509 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:31 compute-0 nova_compute[183177]: 2026-01-26 19:58:31.884 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:32 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:32.640 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:58:32 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:32.641 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:58:32 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:32.642 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:32 compute-0 nova_compute[183177]: 2026-01-26 19:58:32.675 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:32 compute-0 sshd-session[211913]: Connection closed by authenticating user root 217.71.201.142 port 43072 [preauth]
Jan 26 19:58:33 compute-0 sshd-session[211916]: Connection closed by authenticating user root 217.71.201.142 port 43080 [preauth]
Jan 26 19:58:33 compute-0 nova_compute[183177]: 2026-01-26 19:58:33.829 183181 DEBUG nova.network.neutron [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Port 7c52c835-1a73-4fe5-bfd5-4754a6b58543 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Jan 26 19:58:33 compute-0 nova_compute[183177]: 2026-01-26 19:58:33.844 183181 DEBUG nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4dp7jytc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='627aba4c-49d0-4c24-ba4b-ef4bd4843932',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Jan 26 19:58:36 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 26 19:58:36 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.468 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.511 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 kernel: tap7c52c835-1a: entered promiscuous mode
Jan 26 19:58:36 compute-0 NetworkManager[55489]: <info>  [1769457516.6499] manager: (tap7c52c835-1a): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Jan 26 19:58:36 compute-0 ovn_controller[95396]: 2026-01-26T19:58:36Z|00152|binding|INFO|Claiming lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 for this additional chassis.
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.651 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 ovn_controller[95396]: 2026-01-26T19:58:36Z|00153|binding|INFO|7c52c835-1a73-4fe5-bfd5-4754a6b58543: Claiming fa:16:3e:6d:ff:c7 10.100.0.13
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.663 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ff:c7 10.100.0.13'], port_security=['fa:16:3e:6d:ff:c7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '627aba4c-49d0-4c24-ba4b-ef4bd4843932', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '10', 'neutron:security_group_ids': '89a2c834-f9c0-4d4f-90b6-d26cac037c36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4429b655-2f88-4c91-b9bf-1d7728eb783d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=7c52c835-1a73-4fe5-bfd5-4754a6b58543) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.664 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 7c52c835-1a73-4fe5-bfd5-4754a6b58543 in datapath 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 unbound from our chassis
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.666 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bd32c7b-f3d3-4dc7-82df-0adf92573e26
Jan 26 19:58:36 compute-0 ovn_controller[95396]: 2026-01-26T19:58:36Z|00154|binding|INFO|Setting lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 ovn-installed in OVS
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.683 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.685 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.690 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.693 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[49c7d614-b852-40bf-803c-fb4660c817a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 systemd-udevd[211954]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 19:58:36 compute-0 systemd-machined[154465]: New machine qemu-14-instance-00000015.
Jan 26 19:58:36 compute-0 NetworkManager[55489]: <info>  [1769457516.7251] device (tap7c52c835-1a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 19:58:36 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000015.
Jan 26 19:58:36 compute-0 NetworkManager[55489]: <info>  [1769457516.7263] device (tap7c52c835-1a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.743 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb64ea8-0372-4e5b-8935-25672be5eaa7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.747 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[50d567d9-1250-4d17-a6b1-9c7f06cb348f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.788 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[90be4fed-01e6-4076-a018-08a77c85b5a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.818 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fa8677-d9c3-4614-9dae-b8231c048be9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bd32c7b-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a9:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490173, 'reachable_time': 37567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211967, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.848 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e443afc-82c8-40d9-a662-6ca4b5c2316f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4bd32c7b-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490189, 'tstamp': 490189}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211968, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4bd32c7b-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490193, 'tstamp': 490193}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211968, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.850 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bd32c7b-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.852 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 nova_compute[183177]: 2026-01-26 19:58:36.853 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.854 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bd32c7b-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.854 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.854 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bd32c7b-f0, col_values=(('external_ids', {'iface-id': 'a64be302-a856-412d-b0cb-ff5bd9fb5cec'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.855 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:58:36 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:36.857 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[859b37d9-e5e7-4a6a-bfda-2533b2a7a46f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4bd32c7b-f3d3-4dc7-82df-0adf92573e26\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4bd32c7b-f3d3-4dc7-82df-0adf92573e26\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:37 compute-0 sshd-session[211944]: Connection closed by authenticating user root 217.71.201.142 port 43082 [preauth]
Jan 26 19:58:37 compute-0 sshd-session[211991]: Connection closed by authenticating user root 217.71.201.142 port 43100 [preauth]
Jan 26 19:58:39 compute-0 ovn_controller[95396]: 2026-01-26T19:58:39Z|00155|binding|INFO|Claiming lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 for this chassis.
Jan 26 19:58:39 compute-0 ovn_controller[95396]: 2026-01-26T19:58:39Z|00156|binding|INFO|7c52c835-1a73-4fe5-bfd5-4754a6b58543: Claiming fa:16:3e:6d:ff:c7 10.100.0.13
Jan 26 19:58:39 compute-0 ovn_controller[95396]: 2026-01-26T19:58:39Z|00157|binding|INFO|Setting lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 up in Southbound
Jan 26 19:58:39 compute-0 sshd-session[212007]: Connection closed by authenticating user root 217.71.201.142 port 43104 [preauth]
Jan 26 19:58:40 compute-0 sshd-session[212009]: Connection closed by authenticating user root 217.71.201.142 port 43112 [preauth]
Jan 26 19:58:40 compute-0 nova_compute[183177]: 2026-01-26 19:58:40.212 183181 INFO nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Post operation of migration started
Jan 26 19:58:40 compute-0 nova_compute[183177]: 2026-01-26 19:58:40.213 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:40 compute-0 nova_compute[183177]: 2026-01-26 19:58:40.890 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:40 compute-0 nova_compute[183177]: 2026-01-26 19:58:40.891 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.010 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.011 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.011 183181 DEBUG nova.network.neutron [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.473 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.513 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:41 compute-0 nova_compute[183177]: 2026-01-26 19:58:41.518 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:41 compute-0 sshd-session[212011]: Connection closed by authenticating user root 217.71.201.142 port 43116 [preauth]
Jan 26 19:58:42 compute-0 nova_compute[183177]: 2026-01-26 19:58:42.092 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:42 compute-0 nova_compute[183177]: 2026-01-26 19:58:42.264 183181 DEBUG nova.network.neutron [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Updating instance_info_cache with network_info: [{"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:58:42 compute-0 sshd-session[212013]: Connection closed by authenticating user root 217.71.201.142 port 43124 [preauth]
Jan 26 19:58:42 compute-0 nova_compute[183177]: 2026-01-26 19:58:42.772 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-627aba4c-49d0-4c24-ba4b-ef4bd4843932" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:58:43 compute-0 nova_compute[183177]: 2026-01-26 19:58:43.298 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:43 compute-0 nova_compute[183177]: 2026-01-26 19:58:43.299 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:43 compute-0 nova_compute[183177]: 2026-01-26 19:58:43.300 183181 DEBUG oslo_concurrency.lockutils [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:43 compute-0 nova_compute[183177]: 2026-01-26 19:58:43.308 183181 INFO nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 26 19:58:43 compute-0 virtqemud[182929]: Domain id=14 name='instance-00000015' uuid=627aba4c-49d0-4c24-ba4b-ef4bd4843932 is tainted: custom-monitor
Jan 26 19:58:44 compute-0 nova_compute[183177]: 2026-01-26 19:58:44.320 183181 INFO nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 26 19:58:44 compute-0 nova_compute[183177]: 2026-01-26 19:58:44.598 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:45 compute-0 nova_compute[183177]: 2026-01-26 19:58:45.329 183181 INFO nova.virt.libvirt.driver [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 26 19:58:45 compute-0 nova_compute[183177]: 2026-01-26 19:58:45.335 183181 DEBUG nova.compute.manager [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 19:58:45 compute-0 nova_compute[183177]: 2026-01-26 19:58:45.846 183181 DEBUG nova.objects.instance [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Jan 26 19:58:45 compute-0 sshd-session[212015]: Connection closed by authenticating user root 217.71.201.142 port 43128 [preauth]
Jan 26 19:58:46 compute-0 nova_compute[183177]: 2026-01-26 19:58:46.514 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:46 compute-0 nova_compute[183177]: 2026-01-26 19:58:46.516 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:46 compute-0 sshd-session[212017]: Connection closed by authenticating user root 193.32.162.151 port 37804 [preauth]
Jan 26 19:58:46 compute-0 nova_compute[183177]: 2026-01-26 19:58:46.870 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:47 compute-0 sshd-session[212019]: Connection closed by authenticating user root 217.71.201.142 port 43142 [preauth]
Jan 26 19:58:47 compute-0 nova_compute[183177]: 2026-01-26 19:58:47.901 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:47 compute-0 nova_compute[183177]: 2026-01-26 19:58:47.901 183181 WARNING neutronclient.v2_0.client [None req-cbc2f21b-4676-4679-8203-f1955fd125aa 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:48 compute-0 sshd-session[212021]: Connection closed by authenticating user root 217.71.201.142 port 43150 [preauth]
Jan 26 19:58:48 compute-0 sshd-session[212023]: Connection closed by authenticating user root 217.71.201.142 port 43154 [preauth]
Jan 26 19:58:50 compute-0 sshd-session[212025]: Connection closed by authenticating user root 217.71.201.142 port 43158 [preauth]
Jan 26 19:58:51 compute-0 sshd-session[212027]: Connection closed by authenticating user root 217.71.201.142 port 43162 [preauth]
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.556 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.558 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 19:58:51 compute-0 nova_compute[183177]: 2026-01-26 19:58:51.559 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:51 compute-0 sshd-session[212030]: Connection closed by authenticating user root 217.71.201.142 port 43164 [preauth]
Jan 26 19:58:52 compute-0 sshd-session[212032]: Connection closed by authenticating user root 217.71.201.142 port 43168 [preauth]
Jan 26 19:58:53 compute-0 sshd-session[212034]: Connection closed by authenticating user root 217.71.201.142 port 43172 [preauth]
Jan 26 19:58:53 compute-0 sshd-session[212036]: Connection closed by authenticating user root 188.166.116.149 port 39332 [preauth]
Jan 26 19:58:53 compute-0 sshd-session[212038]: Connection closed by authenticating user root 217.71.201.142 port 43178 [preauth]
Jan 26 19:58:54 compute-0 podman[212041]: 2026-01-26 19:58:54.414368001 +0000 UTC m=+0.089660341 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7)
Jan 26 19:58:54 compute-0 podman[212042]: 2026-01-26 19:58:54.433722677 +0000 UTC m=+0.103515970 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Jan 26 19:58:54 compute-0 podman[212040]: 2026-01-26 19:58:54.444247997 +0000 UTC m=+0.119468865 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.488 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.489 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.489 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.489 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.490 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:55 compute-0 nova_compute[183177]: 2026-01-26 19:58:55.508 183181 INFO nova.compute.manager [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Terminating instance
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.026 183181 DEBUG nova.compute.manager [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:58:56 compute-0 kernel: tap7c52c835-1a (unregistering): left promiscuous mode
Jan 26 19:58:56 compute-0 NetworkManager[55489]: <info>  [1769457536.0569] device (tap7c52c835-1a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 ovn_controller[95396]: 2026-01-26T19:58:56Z|00158|binding|INFO|Releasing lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 from this chassis (sb_readonly=0)
Jan 26 19:58:56 compute-0 ovn_controller[95396]: 2026-01-26T19:58:56Z|00159|binding|INFO|Setting lport 7c52c835-1a73-4fe5-bfd5-4754a6b58543 down in Southbound
Jan 26 19:58:56 compute-0 ovn_controller[95396]: 2026-01-26T19:58:56Z|00160|binding|INFO|Removing iface tap7c52c835-1a ovn-installed in OVS
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.076 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.084 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:ff:c7 10.100.0.13'], port_security=['fa:16:3e:6d:ff:c7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '627aba4c-49d0-4c24-ba4b-ef4bd4843932', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '15', 'neutron:security_group_ids': '89a2c834-f9c0-4d4f-90b6-d26cac037c36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4429b655-2f88-4c91-b9bf-1d7728eb783d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=7c52c835-1a73-4fe5-bfd5-4754a6b58543) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.085 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 7c52c835-1a73-4fe5-bfd5-4754a6b58543 in datapath 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 unbound from our chassis
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.087 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bd32c7b-f3d3-4dc7-82df-0adf92573e26
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.098 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.114 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[38515297-023c-4e71-9468-000a92f1a35c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 26 19:58:56 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000015.scope: Consumed 2.033s CPU time.
Jan 26 19:58:56 compute-0 systemd-machined[154465]: Machine qemu-14-instance-00000015 terminated.
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.156 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ba321b-6dfe-4031-b543-8df109357e47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.159 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[b3123b10-09be-4cb9-83fd-780b2e94fff8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.206 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[7733fccf-d270-4f62-a17a-02795f116752]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.239 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d16615-c977-4412-a40f-f8041c9cd1a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bd32c7b-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a9:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490173, 'reachable_time': 37567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212114, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.271 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2793970f-c179-4ca1-a07d-04f2040a54a6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4bd32c7b-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490189, 'tstamp': 490189}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212117, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4bd32c7b-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490193, 'tstamp': 490193}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212117, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.272 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bd32c7b-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.274 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.283 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.283 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bd32c7b-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.284 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.284 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bd32c7b-f0, col_values=(('external_ids', {'iface-id': 'a64be302-a856-412d-b0cb-ff5bd9fb5cec'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.284 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 19:58:56 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:58:56.287 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0ef6c4-e388-4b0c-b4cf-37ca2ef3a59f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-4bd32c7b-f3d3-4dc7-82df-0adf92573e26\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 4bd32c7b-f3d3-4dc7-82df-0adf92573e26\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.328 183181 INFO nova.virt.libvirt.driver [-] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Instance destroyed successfully.
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.329 183181 DEBUG nova.objects.instance [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lazy-loading 'resources' on Instance uuid 627aba4c-49d0-4c24-ba4b-ef4bd4843932 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.600 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.842 183181 DEBUG nova.virt.libvirt.vif [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2026-01-26T19:57:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1069258687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1069258687',id=21,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:57:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d75b993944424869a47d42c106a38c67',ramdisk_id='',reservation_id='r-ys7zwnp5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',clean_attempts='1',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:58:46Z,user_data=None,user_id='9a35d70b2552448eb80c3a52422369a8',uuid=627aba4c-49d0-4c24-ba4b-ef4bd4843932,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.843 183181 DEBUG nova.network.os_vif_util [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converting VIF {"id": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "address": "fa:16:3e:6d:ff:c7", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c52c835-1a", "ovs_interfaceid": "7c52c835-1a73-4fe5-bfd5-4754a6b58543", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.844 183181 DEBUG nova.network.os_vif_util [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.844 183181 DEBUG os_vif [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.847 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.847 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c52c835-1a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.849 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.851 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.853 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.853 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a00d0e22-6908-43bd-be81-938a5d05085d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.855 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.856 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.859 183181 INFO os_vif [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:ff:c7,bridge_name='br-int',has_traffic_filtering=True,id=7c52c835-1a73-4fe5-bfd5-4754a6b58543,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c52c835-1a')
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.860 183181 INFO nova.virt.libvirt.driver [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Deleting instance files /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932_del
Jan 26 19:58:56 compute-0 nova_compute[183177]: 2026-01-26 19:58:56.861 183181 INFO nova.virt.libvirt.driver [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Deletion of /var/lib/nova/instances/627aba4c-49d0-4c24-ba4b-ef4bd4843932_del complete
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.140 183181 DEBUG nova.compute.manager [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Received event network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.140 183181 DEBUG oslo_concurrency.lockutils [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.141 183181 DEBUG oslo_concurrency.lockutils [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.141 183181 DEBUG oslo_concurrency.lockutils [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.141 183181 DEBUG nova.compute.manager [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] No waiting events found dispatching network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.141 183181 DEBUG nova.compute.manager [req-f8133b31-fbb9-4f38-bbeb-3d2e098c74a1 req-a92daf7a-4051-43b2-ab9f-ccccd889213e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Received event network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.378 183181 INFO nova.compute.manager [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Took 1.35 seconds to destroy the instance on the hypervisor.
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.379 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.379 183181 DEBUG nova.compute.manager [-] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.380 183181 DEBUG nova.network.neutron [-] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.380 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.667 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:57 compute-0 nova_compute[183177]: 2026-01-26 19:58:57.959 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:58:58 compute-0 nova_compute[183177]: 2026-01-26 19:58:58.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:58:58 compute-0 nova_compute[183177]: 2026-01-26 19:58:58.654 183181 DEBUG nova.compute.manager [req-4b44e512-e185-44d8-bcad-2945e19824c5 req-dcef6ab4-8504-43af-aef2-e5249dfcd43c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Received event network-vif-deleted-7c52c835-1a73-4fe5-bfd5-4754a6b58543 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:58:58 compute-0 nova_compute[183177]: 2026-01-26 19:58:58.655 183181 INFO nova.compute.manager [req-4b44e512-e185-44d8-bcad-2945e19824c5 req-dcef6ab4-8504-43af-aef2-e5249dfcd43c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Neutron deleted interface 7c52c835-1a73-4fe5-bfd5-4754a6b58543; detaching it from the instance and deleting it from the info cache
Jan 26 19:58:58 compute-0 nova_compute[183177]: 2026-01-26 19:58:58.656 183181 DEBUG nova.network.neutron [req-4b44e512-e185-44d8-bcad-2945e19824c5 req-dcef6ab4-8504-43af-aef2-e5249dfcd43c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.061 183181 DEBUG nova.network.neutron [-] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.166 183181 DEBUG nova.compute.manager [req-4b44e512-e185-44d8-bcad-2945e19824c5 req-dcef6ab4-8504-43af-aef2-e5249dfcd43c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Detach interface failed, port_id=7c52c835-1a73-4fe5-bfd5-4754a6b58543, reason: Instance 627aba4c-49d0-4c24-ba4b-ef4bd4843932 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.189 183181 DEBUG nova.compute.manager [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Received event network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.189 183181 DEBUG oslo_concurrency.lockutils [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.190 183181 DEBUG oslo_concurrency.lockutils [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.190 183181 DEBUG oslo_concurrency.lockutils [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.190 183181 DEBUG nova.compute.manager [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] No waiting events found dispatching network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.191 183181 DEBUG nova.compute.manager [req-e9b7fcd7-3df1-44e0-881a-9c82d878dcf6 req-5deb2c2a-aae2-457a-aaaf-34b2f561cded 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Received event network-vif-unplugged-7c52c835-1a73-4fe5-bfd5-4754a6b58543 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:58:59 compute-0 sshd-session[212134]: Connection closed by authenticating user root 142.93.140.142 port 40214 [preauth]
Jan 26 19:58:59 compute-0 nova_compute[183177]: 2026-01-26 19:58:59.570 183181 INFO nova.compute.manager [-] [instance: 627aba4c-49d0-4c24-ba4b-ef4bd4843932] Took 2.19 seconds to deallocate network for instance.
Jan 26 19:58:59 compute-0 podman[192499]: time="2026-01-26T19:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:58:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 19:58:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 26 19:59:00 compute-0 nova_compute[183177]: 2026-01-26 19:59:00.094 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:00 compute-0 nova_compute[183177]: 2026-01-26 19:59:00.095 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:00 compute-0 nova_compute[183177]: 2026-01-26 19:59:00.101 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:00 compute-0 nova_compute[183177]: 2026-01-26 19:59:00.138 183181 INFO nova.scheduler.client.report [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Deleted allocations for instance 627aba4c-49d0-4c24-ba4b-ef4bd4843932
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.177 183181 DEBUG oslo_concurrency.lockutils [None req-40503cf2-1261-4699-89f4-d667f46c9e30 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "627aba4c-49d0-4c24-ba4b-ef4bd4843932" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.688s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:01 compute-0 openstack_network_exporter[195363]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:59:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:59:01 compute-0 openstack_network_exporter[195363]: ERROR   19:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:59:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:59:01 compute-0 sshd-session[212136]: Connection closed by authenticating user root 217.71.201.142 port 43182 [preauth]
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.652 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.718 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.719 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.720 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.720 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.721 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.736 183181 INFO nova.compute.manager [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Terminating instance
Jan 26 19:59:01 compute-0 nova_compute[183177]: 2026-01-26 19:59:01.855 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.260 183181 DEBUG nova.compute.manager [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 19:59:02 compute-0 kernel: tap2da42ab1-63 (unregistering): left promiscuous mode
Jan 26 19:59:02 compute-0 NetworkManager[55489]: <info>  [1769457542.2849] device (tap2da42ab1-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 19:59:02 compute-0 ovn_controller[95396]: 2026-01-26T19:59:02Z|00161|binding|INFO|Releasing lport 2da42ab1-63fa-490f-96ad-85fd462fa1a4 from this chassis (sb_readonly=0)
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.299 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 ovn_controller[95396]: 2026-01-26T19:59:02Z|00162|binding|INFO|Setting lport 2da42ab1-63fa-490f-96ad-85fd462fa1a4 down in Southbound
Jan 26 19:59:02 compute-0 ovn_controller[95396]: 2026-01-26T19:59:02Z|00163|binding|INFO|Removing iface tap2da42ab1-63 ovn-installed in OVS
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.305 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.311 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:28:eb 10.100.0.10'], port_security=['fa:16:3e:db:28:eb 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3538b478-193d-4710-b409-b238c7fee35a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd75b993944424869a47d42c106a38c67', 'neutron:revision_number': '5', 'neutron:security_group_ids': '89a2c834-f9c0-4d4f-90b6-d26cac037c36', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4429b655-2f88-4c91-b9bf-1d7728eb783d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=2da42ab1-63fa-490f-96ad-85fd462fa1a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.314 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 2da42ab1-63fa-490f-96ad-85fd462fa1a4 in datapath 4bd32c7b-f3d3-4dc7-82df-0adf92573e26 unbound from our chassis
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.315 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.316 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6024508b-ab3f-4866-b2b5-9b3a4613955f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.317 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26 namespace which is not needed anymore
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.331 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 26 19:59:02 compute-0 podman[212138]: 2026-01-26 19:59:02.342705116 +0000 UTC m=+0.088883610 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:59:02 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000014.scope: Consumed 17.722s CPU time.
Jan 26 19:59:02 compute-0 systemd-machined[154465]: Machine qemu-13-instance-00000014 terminated.
Jan 26 19:59:02 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [NOTICE]   (211600) : haproxy version is 3.0.5-8e879a5
Jan 26 19:59:02 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [NOTICE]   (211600) : path to executable is /usr/sbin/haproxy
Jan 26 19:59:02 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [WARNING]  (211600) : Exiting Master process...
Jan 26 19:59:02 compute-0 podman[212187]: 2026-01-26 19:59:02.489357904 +0000 UTC m=+0.053667571 container kill 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Jan 26 19:59:02 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [ALERT]    (211600) : Current worker (211602) exited with code 143 (Terminated)
Jan 26 19:59:02 compute-0 neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26[211596]: [WARNING]  (211600) : All workers exited. Exiting... (0)
Jan 26 19:59:02 compute-0 systemd[1]: libpod-86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c.scope: Deactivated successfully.
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.532 183181 DEBUG nova.compute.manager [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.533 183181 DEBUG oslo_concurrency.lockutils [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.534 183181 DEBUG oslo_concurrency.lockutils [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.534 183181 DEBUG oslo_concurrency.lockutils [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.536 183181 DEBUG nova.compute.manager [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] No waiting events found dispatching network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.536 183181 DEBUG nova.compute.manager [req-2639ebf5-a637-418d-88e7-e26aa7181f26 req-07273273-96cc-4728-8a86-f5f7af5ede01 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:59:02 compute-0 podman[212207]: 2026-01-26 19:59:02.541295348 +0000 UTC m=+0.033363449 container died 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.563 183181 INFO nova.virt.libvirt.driver [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Instance destroyed successfully.
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.564 183181 DEBUG nova.objects.instance [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lazy-loading 'resources' on Instance uuid 3538b478-193d-4710-b409-b238c7fee35a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 19:59:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c-userdata-shm.mount: Deactivated successfully.
Jan 26 19:59:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-b7f5e34b5e1e0b47311717dfa263579a459c02fb6de46d8e559579bb3b7b5f32-merged.mount: Deactivated successfully.
Jan 26 19:59:02 compute-0 podman[212207]: 2026-01-26 19:59:02.593287655 +0000 UTC m=+0.085355696 container cleanup 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 19:59:02 compute-0 systemd[1]: libpod-conmon-86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c.scope: Deactivated successfully.
Jan 26 19:59:02 compute-0 podman[212218]: 2026-01-26 19:59:02.614765627 +0000 UTC m=+0.082217413 container remove 86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.622 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[07bc8306-212b-4a24-88a8-bba666c9e733]: (4, ("Mon Jan 26 07:59:02 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26 (86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c)\n86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c\nMon Jan 26 07:59:02 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26 (86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c)\n86ffc58d6d893c3b51c3667f633c40cf3d3235a81aab32c04ded0cbc04e5310c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.624 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[70f384e6-4ed7-489f-8458-4822573a207f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.625 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bd32c7b-f3d3-4dc7-82df-0adf92573e26.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.625 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[06502984-670d-4738-ae92-c406e6b6c4fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.626 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bd32c7b-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.628 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 kernel: tap4bd32c7b-f0: left promiscuous mode
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.649 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 nova_compute[183177]: 2026-01-26 19:59:02.650 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.654 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3ef273-5201-49ef-8c77-c66f829628a6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.669 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1729d9-f0dd-471a-9aac-cdcc57c1e123]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.671 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[630436ff-9ec5-445d-a01b-6ceb9d371308]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.698 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a76e3e0b-b635-4f96-8749-05ebabecd97f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490165, 'reachable_time': 23681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212250, 'error': None, 'target': 'ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d4bd32c7b\x2df3d3\x2d4dc7\x2d82df\x2d0adf92573e26.mount: Deactivated successfully.
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.708 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4bd32c7b-f3d3-4dc7-82df-0adf92573e26 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 19:59:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:02.709 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9436be-cba3-48f5-aea3-1a69b181991e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.071 183181 DEBUG nova.virt.libvirt.vif [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:57:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-922921175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-922921175',id=20,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T19:57:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d75b993944424869a47d42c106a38c67',ramdisk_id='',reservation_id='r-c0r4mjx8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1405101307-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T19:57:24Z,user_data=None,user_id='9a35d70b2552448eb80c3a52422369a8',uuid=3538b478-193d-4710-b409-b238c7fee35a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.072 183181 DEBUG nova.network.os_vif_util [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converting VIF {"id": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "address": "fa:16:3e:db:28:eb", "network": {"id": "4bd32c7b-f3d3-4dc7-82df-0adf92573e26", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1564221256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6eae384f11074574984dfe78117085bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da42ab1-63", "ovs_interfaceid": "2da42ab1-63fa-490f-96ad-85fd462fa1a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.073 183181 DEBUG nova.network.os_vif_util [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.074 183181 DEBUG os_vif [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.076 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.076 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2da42ab1-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.123 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.125 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.126 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.126 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=51a0ac28-dded-4b90-a6f6-36e42167526b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.128 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.129 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.131 183181 INFO os_vif [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:28:eb,bridge_name='br-int',has_traffic_filtering=True,id=2da42ab1-63fa-490f-96ad-85fd462fa1a4,network=Network(4bd32c7b-f3d3-4dc7-82df-0adf92573e26),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da42ab1-63')
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.132 183181 INFO nova.virt.libvirt.driver [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Deleting instance files /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a_del
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.134 183181 INFO nova.virt.libvirt.driver [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Deletion of /var/lib/nova/instances/3538b478-193d-4710-b409-b238c7fee35a_del complete
Jan 26 19:59:03 compute-0 sshd-session[212251]: Connection closed by authenticating user root 217.71.201.142 port 43222 [preauth]
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.650 183181 INFO nova.compute.manager [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Took 1.39 seconds to destroy the instance on the hypervisor.
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.651 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.651 183181 DEBUG nova.compute.manager [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.652 183181 DEBUG nova.network.neutron [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.652 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:59:03 compute-0 nova_compute[183177]: 2026-01-26 19:59:03.954 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:59:03 compute-0 sshd-session[212253]: Connection closed by authenticating user root 217.71.201.142 port 43228 [preauth]
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.156 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.425 183181 DEBUG nova.compute.manager [req-efadefe5-6ae8-4a6f-8a0d-92a37cc972e5 req-8f750ae3-bc91-4157-bdd4-626efad5a078 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-deleted-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.426 183181 INFO nova.compute.manager [req-efadefe5-6ae8-4a6f-8a0d-92a37cc972e5 req-8f750ae3-bc91-4157-bdd4-626efad5a078 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Neutron deleted interface 2da42ab1-63fa-490f-96ad-85fd462fa1a4; detaching it from the instance and deleting it from the info cache
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.426 183181 DEBUG nova.network.neutron [req-efadefe5-6ae8-4a6f-8a0d-92a37cc972e5 req-8f750ae3-bc91-4157-bdd4-626efad5a078 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.608 183181 DEBUG nova.compute.manager [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.609 183181 DEBUG oslo_concurrency.lockutils [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "3538b478-193d-4710-b409-b238c7fee35a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.609 183181 DEBUG oslo_concurrency.lockutils [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.610 183181 DEBUG oslo_concurrency.lockutils [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.610 183181 DEBUG nova.compute.manager [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] No waiting events found dispatching network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.611 183181 DEBUG nova.compute.manager [req-ca70ff96-5616-4e31-aaa7-e0d896948830 req-ac8caef6-7252-422f-a5a4-7d7a82206510 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Received event network-vif-unplugged-2da42ab1-63fa-490f-96ad-85fd462fa1a4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 19:59:04 compute-0 sshd-session[212255]: Connection closed by authenticating user root 217.71.201.142 port 43232 [preauth]
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.863 183181 DEBUG nova.network.neutron [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.910 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.912 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.936 183181 DEBUG nova.compute.manager [req-efadefe5-6ae8-4a6f-8a0d-92a37cc972e5 req-8f750ae3-bc91-4157-bdd4-626efad5a078 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Detach interface failed, port_id=2da42ab1-63fa-490f-96ad-85fd462fa1a4, reason: Instance 3538b478-193d-4710-b409-b238c7fee35a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.954 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.954 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5740MB free_disk=73.09808731079102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.959 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:04 compute-0 nova_compute[183177]: 2026-01-26 19:59:04.960 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:05 compute-0 sshd-session[212258]: Connection closed by authenticating user root 217.71.201.142 port 43238 [preauth]
Jan 26 19:59:05 compute-0 nova_compute[183177]: 2026-01-26 19:59:05.374 183181 INFO nova.compute.manager [-] [instance: 3538b478-193d-4710-b409-b238c7fee35a] Took 1.72 seconds to deallocate network for instance.
Jan 26 19:59:05 compute-0 nova_compute[183177]: 2026-01-26 19:59:05.906 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:05 compute-0 sshd-session[212261]: Connection closed by authenticating user root 217.71.201.142 port 43242 [preauth]
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.040 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 3538b478-193d-4710-b409-b238c7fee35a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.041 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.041 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 19:59:04 up  1:23,  0 user,  load average: 0.23, 0.24, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_d75b993944424869a47d42c106a38c67': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.245 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:59:06 compute-0 sshd-session[212263]: Connection closed by authenticating user root 217.71.201.142 port 43250 [preauth]
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:06 compute-0 nova_compute[183177]: 2026-01-26 19:59:06.752 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:59:07 compute-0 sshd-session[212265]: Connection closed by authenticating user root 217.71.201.142 port 43256 [preauth]
Jan 26 19:59:07 compute-0 nova_compute[183177]: 2026-01-26 19:59:07.265 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 19:59:07 compute-0 nova_compute[183177]: 2026-01-26 19:59:07.266 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.306s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:07 compute-0 nova_compute[183177]: 2026-01-26 19:59:07.267 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.361s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:07 compute-0 nova_compute[183177]: 2026-01-26 19:59:07.311 183181 DEBUG nova.compute.provider_tree [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:59:07 compute-0 sshd-session[212267]: Connection closed by authenticating user root 217.71.201.142 port 43260 [preauth]
Jan 26 19:59:07 compute-0 nova_compute[183177]: 2026-01-26 19:59:07.818 183181 DEBUG nova.scheduler.client.report [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.128 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.259 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.260 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.260 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.260 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.334 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.068s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:08 compute-0 nova_compute[183177]: 2026-01-26 19:59:08.383 183181 INFO nova.scheduler.client.report [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Deleted allocations for instance 3538b478-193d-4710-b409-b238c7fee35a
Jan 26 19:59:08 compute-0 sshd-session[212269]: Connection closed by authenticating user root 217.71.201.142 port 43262 [preauth]
Jan 26 19:59:09 compute-0 sshd-session[212271]: Connection closed by authenticating user root 217.71.201.142 port 43268 [preauth]
Jan 26 19:59:09 compute-0 nova_compute[183177]: 2026-01-26 19:59:09.416 183181 DEBUG oslo_concurrency.lockutils [None req-b7084ac9-b7c8-4c04-a2d1-5f6e0fe85547 9a35d70b2552448eb80c3a52422369a8 d75b993944424869a47d42c106a38c67 - - default default] Lock "3538b478-193d-4710-b409-b238c7fee35a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.697s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:09 compute-0 sshd-session[212273]: Connection closed by authenticating user root 217.71.201.142 port 43270 [preauth]
Jan 26 19:59:10 compute-0 sshd-session[212275]: Connection closed by authenticating user root 217.71.201.142 port 43274 [preauth]
Jan 26 19:59:11 compute-0 nova_compute[183177]: 2026-01-26 19:59:11.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:11 compute-0 nova_compute[183177]: 2026-01-26 19:59:11.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:11 compute-0 nova_compute[183177]: 2026-01-26 19:59:11.656 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:12 compute-0 sshd-session[212277]: Connection closed by authenticating user root 217.71.201.142 port 43278 [preauth]
Jan 26 19:59:12 compute-0 sshd-session[212279]: Connection closed by authenticating user root 217.71.201.142 port 43290 [preauth]
Jan 26 19:59:12 compute-0 nova_compute[183177]: 2026-01-26 19:59:12.742 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:13 compute-0 nova_compute[183177]: 2026-01-26 19:59:13.133 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:14 compute-0 nova_compute[183177]: 2026-01-26 19:59:14.086 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:14 compute-0 sshd-session[212281]: Connection closed by authenticating user root 217.71.201.142 port 43292 [preauth]
Jan 26 19:59:15 compute-0 sshd-session[212283]: Connection closed by authenticating user root 217.71.201.142 port 43306 [preauth]
Jan 26 19:59:16 compute-0 sshd-session[212285]: Connection closed by authenticating user root 217.71.201.142 port 43312 [preauth]
Jan 26 19:59:16 compute-0 nova_compute[183177]: 2026-01-26 19:59:16.706 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:18 compute-0 nova_compute[183177]: 2026-01-26 19:59:18.166 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:20 compute-0 sshd-session[212287]: Connection closed by authenticating user root 217.71.201.142 port 43314 [preauth]
Jan 26 19:59:20 compute-0 sshd-session[212289]: Connection closed by authenticating user root 217.71.201.142 port 43330 [preauth]
Jan 26 19:59:21 compute-0 nova_compute[183177]: 2026-01-26 19:59:21.750 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:22 compute-0 sshd-session[212291]: Connection closed by authenticating user root 217.71.201.142 port 43332 [preauth]
Jan 26 19:59:23 compute-0 sshd-session[212293]: Connection closed by authenticating user root 217.71.201.142 port 43336 [preauth]
Jan 26 19:59:23 compute-0 nova_compute[183177]: 2026-01-26 19:59:23.204 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:23 compute-0 sshd-session[212295]: Connection closed by authenticating user root 217.71.201.142 port 43338 [preauth]
Jan 26 19:59:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:24.079 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:24.080 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:24.080 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:24 compute-0 sshd-session[212297]: Connection closed by authenticating user root 217.71.201.142 port 43342 [preauth]
Jan 26 19:59:25 compute-0 podman[212302]: 2026-01-26 19:59:25.359616815 +0000 UTC m=+0.099571796 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 19:59:25 compute-0 podman[212301]: 2026-01-26 19:59:25.365727957 +0000 UTC m=+0.099059821 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 19:59:25 compute-0 podman[212300]: 2026-01-26 19:59:25.451958435 +0000 UTC m=+0.190440437 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 26 19:59:26 compute-0 sshd-session[212359]: Connection closed by authenticating user root 217.71.201.142 port 43348 [preauth]
Jan 26 19:59:26 compute-0 nova_compute[183177]: 2026-01-26 19:59:26.752 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:27 compute-0 sshd-session[212361]: Connection closed by authenticating user root 217.71.201.142 port 43358 [preauth]
Jan 26 19:59:28 compute-0 nova_compute[183177]: 2026-01-26 19:59:28.206 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:28 compute-0 sshd-session[212363]: Connection closed by authenticating user root 217.71.201.142 port 43368 [preauth]
Jan 26 19:59:29 compute-0 podman[192499]: time="2026-01-26T19:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:59:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:59:29 compute-0 podman[192499]: @ - - [26/Jan/2026:19:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 19:59:30 compute-0 sshd-session[212365]: Connection closed by authenticating user root 217.71.201.142 port 43372 [preauth]
Jan 26 19:59:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:30.060 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:e4:a9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '351c14c8e6aa42869583e2a01f2ea90f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74) old=Port_Binding(mac=['fa:16:3e:38:e4:a9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '351c14c8e6aa42869583e2a01f2ea90f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:59:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:30.061 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c updated
Jan 26 19:59:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:30.062 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1220de0-89fa-4020-84a8-6d0a816a5b3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:59:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:30.063 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[131f6d28-ddc0-4612-bfd7-143a9ef2518b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:30 compute-0 sshd-session[212367]: Connection closed by authenticating user root 217.71.201.142 port 43392 [preauth]
Jan 26 19:59:31 compute-0 openstack_network_exporter[195363]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 19:59:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:59:31 compute-0 openstack_network_exporter[195363]: ERROR   19:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 19:59:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 19:59:31 compute-0 nova_compute[183177]: 2026-01-26 19:59:31.796 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:32 compute-0 sshd-session[212369]: Connection closed by authenticating user root 217.71.201.142 port 43396 [preauth]
Jan 26 19:59:32 compute-0 sshd-session[212371]: Connection closed by authenticating user root 217.71.201.142 port 43402 [preauth]
Jan 26 19:59:33 compute-0 nova_compute[183177]: 2026-01-26 19:59:33.208 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:33 compute-0 podman[212375]: 2026-01-26 19:59:33.339494182 +0000 UTC m=+0.090446961 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 19:59:33 compute-0 sshd-session[212373]: Connection closed by authenticating user root 217.71.201.142 port 43406 [preauth]
Jan 26 19:59:34 compute-0 sshd-session[212397]: Connection closed by authenticating user root 217.71.201.142 port 43408 [preauth]
Jan 26 19:59:34 compute-0 sshd-session[212399]: Connection closed by authenticating user root 217.71.201.142 port 43412 [preauth]
Jan 26 19:59:35 compute-0 sshd-session[212403]: Connection closed by authenticating user root 217.71.201.142 port 43414 [preauth]
Jan 26 19:59:35 compute-0 sshd-session[212401]: Connection closed by authenticating user root 142.93.140.142 port 37338 [preauth]
Jan 26 19:59:36 compute-0 sshd-session[212405]: Connection closed by authenticating user root 188.166.116.149 port 49122 [preauth]
Jan 26 19:59:36 compute-0 nova_compute[183177]: 2026-01-26 19:59:36.850 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:37 compute-0 sshd-session[212407]: Connection closed by authenticating user root 217.71.201.142 port 43416 [preauth]
Jan 26 19:59:37 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:37.335 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:59:37 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:37.337 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 19:59:37 compute-0 nova_compute[183177]: 2026-01-26 19:59:37.337 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:37 compute-0 sshd-session[212409]: Connection closed by authenticating user root 217.71.201.142 port 43424 [preauth]
Jan 26 19:59:38 compute-0 nova_compute[183177]: 2026-01-26 19:59:38.211 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:39 compute-0 sshd-session[212412]: Connection closed by authenticating user root 217.71.201.142 port 43428 [preauth]
Jan 26 19:59:40 compute-0 sshd-session[212414]: Connection closed by authenticating user root 217.71.201.142 port 43436 [preauth]
Jan 26 19:59:40 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:40.338 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 19:59:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:41.241 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:be:6c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6b015465-355d-49e7-ac68-0b393cdea65e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b015465-355d-49e7-ac68-0b393cdea65e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9fe30cae-f9c6-410f-95f9-329aa4334b1f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5b64551c-b367-48c4-a3fd-4d27f0a45d8c) old=Port_Binding(mac=['fa:16:3e:49:be:6c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-6b015465-355d-49e7-ac68-0b393cdea65e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b015465-355d-49e7-ac68-0b393cdea65e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 19:59:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:41.243 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5b64551c-b367-48c4-a3fd-4d27f0a45d8c in datapath 6b015465-355d-49e7-ac68-0b393cdea65e updated
Jan 26 19:59:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:41.244 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b015465-355d-49e7-ac68-0b393cdea65e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 19:59:41 compute-0 ovn_metadata_agent[104667]: 2026-01-26 19:59:41.246 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fc092069-a652-4b47-a7fa-627cb7816cf3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 19:59:41 compute-0 sshd-session[212416]: Connection closed by authenticating user root 217.71.201.142 port 43438 [preauth]
Jan 26 19:59:41 compute-0 nova_compute[183177]: 2026-01-26 19:59:41.849 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:43 compute-0 nova_compute[183177]: 2026-01-26 19:59:43.245 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:46 compute-0 nova_compute[183177]: 2026-01-26 19:59:46.922 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:48 compute-0 nova_compute[183177]: 2026-01-26 19:59:48.296 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:48 compute-0 ovn_controller[95396]: 2026-01-26T19:59:48Z|00164|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 19:59:49 compute-0 nova_compute[183177]: 2026-01-26 19:59:49.872 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:49 compute-0 nova_compute[183177]: 2026-01-26 19:59:49.872 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:50 compute-0 nova_compute[183177]: 2026-01-26 19:59:50.379 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 19:59:51 compute-0 nova_compute[183177]: 2026-01-26 19:59:51.459 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:51 compute-0 nova_compute[183177]: 2026-01-26 19:59:51.460 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:51 compute-0 nova_compute[183177]: 2026-01-26 19:59:51.470 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 19:59:51 compute-0 nova_compute[183177]: 2026-01-26 19:59:51.471 183181 INFO nova.compute.claims [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Claim successful on node compute-0.ctlplane.example.com
Jan 26 19:59:51 compute-0 nova_compute[183177]: 2026-01-26 19:59:51.950 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:52 compute-0 nova_compute[183177]: 2026-01-26 19:59:52.548 183181 DEBUG nova.compute.provider_tree [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 19:59:53 compute-0 nova_compute[183177]: 2026-01-26 19:59:53.063 183181 DEBUG nova.scheduler.client.report [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 19:59:53 compute-0 nova_compute[183177]: 2026-01-26 19:59:53.331 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:53 compute-0 nova_compute[183177]: 2026-01-26 19:59:53.577 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:53 compute-0 nova_compute[183177]: 2026-01-26 19:59:53.579 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.099 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.099 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.100 183181 WARNING neutronclient.v2_0.client [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.100 183181 WARNING neutronclient.v2_0.client [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.608 183181 INFO nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 19:59:54 compute-0 nova_compute[183177]: 2026-01-26 19:59:54.756 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Successfully created port: 471030f9-6931-4461-ad62-0082b9c63094 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.118 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.449 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Successfully updated port: 471030f9-6931-4461-ad62-0082b9c63094 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.516 183181 DEBUG nova.compute.manager [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-changed-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.516 183181 DEBUG nova.compute.manager [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Refreshing instance network info cache due to event network-changed-471030f9-6931-4461-ad62-0082b9c63094. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.517 183181 DEBUG oslo_concurrency.lockutils [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.517 183181 DEBUG oslo_concurrency.lockutils [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.518 183181 DEBUG nova.network.neutron [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Refreshing network info cache for port 471030f9-6931-4461-ad62-0082b9c63094 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 19:59:55 compute-0 nova_compute[183177]: 2026-01-26 19:59:55.957 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.025 183181 WARNING neutronclient.v2_0.client [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.165 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.167 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.168 183181 INFO nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Creating image(s)
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.169 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.169 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.170 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.172 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.180 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.182 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.249 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.251 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.252 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.253 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.262 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.263 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.319 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.320 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:56 compute-0 podman[212421]: 2026-01-26 19:59:56.337253176 +0000 UTC m=+0.077442255 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 19:59:56 compute-0 podman[212423]: 2026-01-26 19:59:56.352107352 +0000 UTC m=+0.094403047 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.359 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.360 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.361 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:56 compute-0 podman[212420]: 2026-01-26 19:59:56.414271409 +0000 UTC m=+0.153936124 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.427 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.428 183181 DEBUG nova.virt.disk.api [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Checking if we can resize image /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.429 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.503 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.504 183181 DEBUG nova.virt.disk.api [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Cannot resize image /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.504 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.504 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Ensure instance console log exists: /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.505 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.505 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.505 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 19:59:56 compute-0 nova_compute[183177]: 2026-01-26 19:59:56.960 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:57 compute-0 nova_compute[183177]: 2026-01-26 19:59:57.011 183181 DEBUG nova.network.neutron [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:59:57 compute-0 nova_compute[183177]: 2026-01-26 19:59:57.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:58 compute-0 nova_compute[183177]: 2026-01-26 19:59:58.072 183181 DEBUG nova.network.neutron [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 19:59:58 compute-0 nova_compute[183177]: 2026-01-26 19:59:58.335 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 19:59:58 compute-0 nova_compute[183177]: 2026-01-26 19:59:58.579 183181 DEBUG oslo_concurrency.lockutils [req-c658fbfd-f176-45f1-b766-9dbb44602d56 req-a84e9930-7baf-4704-919e-cfcdcac85aa5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 19:59:58 compute-0 nova_compute[183177]: 2026-01-26 19:59:58.581 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquired lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 19:59:58 compute-0 nova_compute[183177]: 2026-01-26 19:59:58.581 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 19:59:59 compute-0 nova_compute[183177]: 2026-01-26 19:59:59.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 19:59:59 compute-0 nova_compute[183177]: 2026-01-26 19:59:59.221 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 19:59:59 compute-0 podman[192499]: time="2026-01-26T19:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 19:59:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 19:59:59 compute-0 podman[192499]: @ - - [26/Jan/2026:19:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Jan 26 20:00:00 compute-0 nova_compute[183177]: 2026-01-26 20:00:00.166 183181 WARNING neutronclient.v2_0.client [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:00 compute-0 nova_compute[183177]: 2026-01-26 20:00:00.979 183181 DEBUG nova.network.neutron [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updating instance_info_cache with network_info: [{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:00:01 compute-0 openstack_network_exporter[195363]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:00:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:00:01 compute-0 openstack_network_exporter[195363]: ERROR   20:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:00:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.494 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Releasing lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.495 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance network_info: |[{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.497 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Start _get_guest_xml network_info=[{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.501 183181 WARNING nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.502 183181 DEBUG nova.virt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-764172640', uuid='38add01c-5130-4743-8bfc-a2cd9eef81ef'), owner=OwnerMeta(userid='7033feaa27a8427197df3725be3d1a7a', username='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin', projectid='3ab7d887b45a437cabdface06e8a9be1', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1111771467'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": 
"471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457601.5027306) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.511 183181 DEBUG nova.virt.libvirt.host [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.512 183181 DEBUG nova.virt.libvirt.host [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.514 183181 DEBUG nova.virt.libvirt.host [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.515 183181 DEBUG nova.virt.libvirt.host [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.516 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.516 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.517 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.517 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.517 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.517 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.518 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.518 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.518 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.518 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.519 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.519 183181 DEBUG nova.virt.hardware [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.523 183181 DEBUG nova.virt.libvirt.vif [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-764172640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-764172640',id=22,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-ktquge2x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-Tes
tExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:59:55Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=38add01c-5130-4743-8bfc-a2cd9eef81ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.524 183181 DEBUG nova.network.os_vif_util [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.524 183181 DEBUG nova.network.os_vif_util [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.525 183181 DEBUG nova.objects.instance [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 38add01c-5130-4743-8bfc-a2cd9eef81ef obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:00:01 compute-0 nova_compute[183177]: 2026-01-26 20:00:01.996 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.034 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <uuid>38add01c-5130-4743-8bfc-a2cd9eef81ef</uuid>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <name>instance-00000016</name>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-764172640</nova:name>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:00:01</nova:creationTime>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:00:02 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:00:02 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         <nova:port uuid="471030f9-6931-4461-ad62-0082b9c63094">
Jan 26 20:00:02 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <system>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="serial">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="uuid">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </system>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <os>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </os>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <features>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </features>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:7d:f7:01"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <target dev="tap471030f9-69"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <video>
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </video>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:00:02 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:00:02 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:00:02 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:00:02 compute-0 nova_compute[183177]: </domain>
Jan 26 20:00:02 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.034 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Preparing to wait for external event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.035 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.035 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.035 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.035 183181 DEBUG nova.virt.libvirt.vif [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T19:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-764172640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-764172640',id=22,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-ktquge2x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T19:59:55Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=38add01c-5130-4743-8bfc-a2cd9eef81ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.036 183181 DEBUG nova.network.os_vif_util [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.036 183181 DEBUG nova.network.os_vif_util [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.037 183181 DEBUG os_vif [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.037 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.038 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.038 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.039 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ab0ea704-c571-51ac-a49a-6ef14fbaf1f7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.040 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.041 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.042 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.045 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.045 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap471030f9-69, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.046 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap471030f9-69, col_values=(('qos', UUID('370d3c70-ed37-41ac-ab01-e2391c034d05')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.046 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap471030f9-69, col_values=(('external_ids', {'iface-id': '471030f9-6931-4461-ad62-0082b9c63094', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:f7:01', 'vm-uuid': '38add01c-5130-4743-8bfc-a2cd9eef81ef'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.047 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 NetworkManager[55489]: <info>  [1769457602.0482] manager: (tap471030f9-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.049 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.058 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:02 compute-0 nova_compute[183177]: 2026-01-26 20:00:02.059 183181 INFO os_vif [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69')
Jan 26 20:00:03 compute-0 nova_compute[183177]: 2026-01-26 20:00:03.605 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:00:03 compute-0 nova_compute[183177]: 2026-01-26 20:00:03.605 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:00:03 compute-0 nova_compute[183177]: 2026-01-26 20:00:03.605 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No VIF found with MAC fa:16:3e:7d:f7:01, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:00:03 compute-0 nova_compute[183177]: 2026-01-26 20:00:03.606 183181 INFO nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Using config drive
Jan 26 20:00:04 compute-0 nova_compute[183177]: 2026-01-26 20:00:04.118 183181 WARNING neutronclient.v2_0.client [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:04 compute-0 podman[212496]: 2026-01-26 20:00:04.347834843 +0000 UTC m=+0.092020803 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.108 183181 INFO nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Creating config drive at /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.119 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp22lq9lal execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.264 183181 DEBUG oslo_concurrency.processutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp22lq9lal" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:05 compute-0 kernel: tap471030f9-69: entered promiscuous mode
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.3607] manager: (tap471030f9-69): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 26 20:00:05 compute-0 ovn_controller[95396]: 2026-01-26T20:00:05Z|00165|binding|INFO|Claiming lport 471030f9-6931-4461-ad62-0082b9c63094 for this chassis.
Jan 26 20:00:05 compute-0 ovn_controller[95396]: 2026-01-26T20:00:05Z|00166|binding|INFO|471030f9-6931-4461-ad62-0082b9c63094: Claiming fa:16:3e:7d:f7:01 10.100.0.3
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.406 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 systemd-udevd[212535]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.4330] device (tap471030f9-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.4350] device (tap471030f9-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.441 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f7:01 10.100.0.3'], port_security=['fa:16:3e:7d:f7:01 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '38add01c-5130-4743-8bfc-a2cd9eef81ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=471030f9-6931-4461-ad62-0082b9c63094) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.442 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 471030f9-6931-4461-ad62-0082b9c63094 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c bound to our chassis
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.443 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:00:05 compute-0 systemd-machined[154465]: New machine qemu-15-instance-00000016.
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.458 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[952f1b7a-8c5d-49ea-b795-048b34f37a7e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.459 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1220de0-81 in ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.461 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1220de0-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.461 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f628e8-9b08-42cf-8264-cbb2209263a5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.462 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b43d6507-fd73-481c-b460-8b3e2b87c724]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.476 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[8f22e700-870c-4d82-930a-309bd4799137]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_controller[95396]: 2026-01-26T20:00:05Z|00167|binding|INFO|Setting lport 471030f9-6931-4461-ad62-0082b9c63094 ovn-installed in OVS
Jan 26 20:00:05 compute-0 ovn_controller[95396]: 2026-01-26T20:00:05Z|00168|binding|INFO|Setting lport 471030f9-6931-4461-ad62-0082b9c63094 up in Southbound
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.487 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000016.
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.495 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4a236f-4803-4de1-8456-826417b52b41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.526 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc6b04d-f9c2-46d3-8b4e-cf6317b6ec95]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.534 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cb753cfb-86e8-436b-9a8f-4188528d2bd5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 systemd-udevd[212539]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.5361] manager: (tapd1220de0-80): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.572 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[100458ea-5a13-40ba-b93c-3ac74589b374]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.574 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e73a33ed-bfc8-4a97-a29e-0393b986a6bc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.6008] device (tapd1220de0-80): carrier: link connected
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.606 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[3bce04e0-f569-4ee2-a76c-a86235de9f74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.625 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[55752b3f-28b6-46de-805f-fcf95728da09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506448, 'reachable_time': 31878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212571, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.642 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[15359abf-0413-4335-9a0c-7f4b7b0758c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:e4a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506448, 'tstamp': 506448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212572, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.659 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ae531595-2d38-4421-b125-2d2903400b58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506448, 'reachable_time': 31878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212573, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.705 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c73b7977-bdc1-437c-8a77-78b416c7eb43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.800 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e65814fd-ebc4-436e-b2b6-ad2c1ac4e5e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.802 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.803 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.803 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.805 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 NetworkManager[55489]: <info>  [1769457605.8064] manager: (tapd1220de0-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 26 20:00:05 compute-0 kernel: tapd1220de0-80: entered promiscuous mode
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.810 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.811 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.812 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.814 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:05 compute-0 ovn_controller[95396]: 2026-01-26T20:00:05Z|00169|binding|INFO|Releasing lport 7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74 from this chassis (sb_readonly=0)
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.816 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8df4f7b-1005-442b-9328-0eec2337c431]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.817 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.818 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.818 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d1220de0-89fa-4020-84a8-6d0a816a5b3c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.818 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.819 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6d88c7ed-5a0b-4433-90dc-d3ba2c416e4a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.820 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.820 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbad1b8-299d-48d8-92fd-26fe51b664f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.821 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:00:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:05.822 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'env', 'PROCESS_TAG=haproxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1220de0-89fa-4020-84a8-6d0a816a5b3c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:00:05 compute-0 nova_compute[183177]: 2026-01-26 20:00:05.858 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.201 183181 DEBUG nova.compute.manager [req-698594e3-ad0d-4f02-9701-fe1c39abc58e req-b4c3f091-aa14-4ea4-afe1-52857e5ec476 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.201 183181 DEBUG oslo_concurrency.lockutils [req-698594e3-ad0d-4f02-9701-fe1c39abc58e req-b4c3f091-aa14-4ea4-afe1-52857e5ec476 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.202 183181 DEBUG oslo_concurrency.lockutils [req-698594e3-ad0d-4f02-9701-fe1c39abc58e req-b4c3f091-aa14-4ea4-afe1-52857e5ec476 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.202 183181 DEBUG oslo_concurrency.lockutils [req-698594e3-ad0d-4f02-9701-fe1c39abc58e req-b4c3f091-aa14-4ea4-afe1-52857e5ec476 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.203 183181 DEBUG nova.compute.manager [req-698594e3-ad0d-4f02-9701-fe1c39abc58e req-b4c3f091-aa14-4ea4-afe1-52857e5ec476 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Processing event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:00:06 compute-0 podman[212605]: 2026-01-26 20:00:06.342733131 +0000 UTC m=+0.086518197 container create 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 20:00:06 compute-0 systemd[1]: Started libpod-conmon-4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808.scope.
Jan 26 20:00:06 compute-0 podman[212605]: 2026-01-26 20:00:06.298760189 +0000 UTC m=+0.042545295 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:00:06 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:00:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e931d3a7067adc7f52a029288188d6ff7bc8ae757e28b957efc7697ad8496d9e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:00:06 compute-0 podman[212605]: 2026-01-26 20:00:06.447359499 +0000 UTC m=+0.191144575 container init 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.449 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.453 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:00:06 compute-0 podman[212605]: 2026-01-26 20:00:06.458566268 +0000 UTC m=+0.202351324 container start 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.457 183181 INFO nova.virt.libvirt.driver [-] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance spawned successfully.
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.457 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:00:06 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [NOTICE]   (212632) : New worker (212634) forked
Jan 26 20:00:06 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [NOTICE]   (212632) : Loading success.
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.718 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.807 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.809 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.866 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.973 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.974 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.975 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.975 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.976 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.977 183181 DEBUG nova.virt.libvirt.driver [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:06 compute-0 nova_compute[183177]: 2026-01-26 20:00:06.997 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.042 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.044 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.052 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.064 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.065 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.09736251831055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.065 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:07 compute-0 nova_compute[183177]: 2026-01-26 20:00:07.065 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.960 183181 DEBUG nova.compute.manager [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.962 183181 DEBUG oslo_concurrency.lockutils [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.963 183181 DEBUG oslo_concurrency.lockutils [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.964 183181 DEBUG oslo_concurrency.lockutils [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.964 183181 DEBUG nova.compute.manager [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No waiting events found dispatching network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:00:09 compute-0 nova_compute[183177]: 2026-01-26 20:00:09.965 183181 WARNING nova.compute.manager [req-1fa47526-b6df-4503-a29c-8a119d443c08 req-223d59c4-d654-4d05-9610-7d89cdcef912 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received unexpected event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with vm_state building and task_state spawning.
Jan 26 20:00:10 compute-0 nova_compute[183177]: 2026-01-26 20:00:10.456 183181 INFO nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Took 14.29 seconds to spawn the instance on the hypervisor.
Jan 26 20:00:10 compute-0 nova_compute[183177]: 2026-01-26 20:00:10.456 183181 DEBUG nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:00:10 compute-0 nova_compute[183177]: 2026-01-26 20:00:10.997 183181 INFO nova.compute.manager [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Took 19.61 seconds to build instance.
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.009 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 38add01c-5130-4743-8bfc-a2cd9eef81ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.009 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.009 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:00:07 up  1:24,  0 user,  load average: 0.13, 0.21, 0.27\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_3ab7d887b45a437cabdface06e8a9be1': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.127 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.505 183181 DEBUG oslo_concurrency.lockutils [None req-debcf086-dd47-4307-a47e-5ae3ea280174 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.633s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:11 compute-0 nova_compute[183177]: 2026-01-26 20:00:11.635 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:00:12 compute-0 nova_compute[183177]: 2026-01-26 20:00:12.031 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:12 compute-0 nova_compute[183177]: 2026-01-26 20:00:12.055 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:12 compute-0 nova_compute[183177]: 2026-01-26 20:00:12.144 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:00:12 compute-0 nova_compute[183177]: 2026-01-26 20:00:12.145 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:14 compute-0 sshd-session[212650]: Connection closed by authenticating user root 142.93.140.142 port 35862 [preauth]
Jan 26 20:00:15 compute-0 nova_compute[183177]: 2026-01-26 20:00:15.836 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:15 compute-0 nova_compute[183177]: 2026-01-26 20:00:15.838 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.145 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.145 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.145 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.146 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.146 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.346 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.896 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.896 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.902 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:00:16 compute-0 nova_compute[183177]: 2026-01-26 20:00:16.902 183181 INFO nova.compute.claims [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.056 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.058 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.058 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.059 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.069 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.070 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 20:00:17 compute-0 nova_compute[183177]: 2026-01-26 20:00:17.976 183181 DEBUG nova.compute.provider_tree [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:00:18 compute-0 ovn_controller[95396]: 2026-01-26T20:00:18Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:f7:01 10.100.0.3
Jan 26 20:00:18 compute-0 ovn_controller[95396]: 2026-01-26T20:00:18Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:f7:01 10.100.0.3
Jan 26 20:00:18 compute-0 nova_compute[183177]: 2026-01-26 20:00:18.484 183181 DEBUG nova.scheduler.client.report [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:00:18 compute-0 nova_compute[183177]: 2026-01-26 20:00:18.993 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:18 compute-0 nova_compute[183177]: 2026-01-26 20:00:18.994 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:00:19 compute-0 nova_compute[183177]: 2026-01-26 20:00:19.507 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:00:19 compute-0 nova_compute[183177]: 2026-01-26 20:00:19.508 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:00:19 compute-0 nova_compute[183177]: 2026-01-26 20:00:19.509 183181 WARNING neutronclient.v2_0.client [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:19 compute-0 nova_compute[183177]: 2026-01-26 20:00:19.509 183181 WARNING neutronclient.v2_0.client [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:19 compute-0 sshd-session[212663]: Connection closed by authenticating user root 188.166.116.149 port 40116 [preauth]
Jan 26 20:00:20 compute-0 nova_compute[183177]: 2026-01-26 20:00:20.017 183181 INFO nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:00:20 compute-0 nova_compute[183177]: 2026-01-26 20:00:20.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:20 compute-0 nova_compute[183177]: 2026-01-26 20:00:20.477 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Successfully created port: be9c9205-fe76-40b8-9a54-e960c9a57576 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:00:20 compute-0 nova_compute[183177]: 2026-01-26 20:00:20.525 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.355 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Successfully updated port: be9c9205-fe76-40b8-9a54-e960c9a57576 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.432 183181 DEBUG nova.compute.manager [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-changed-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.432 183181 DEBUG nova.compute.manager [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Refreshing instance network info cache due to event network-changed-be9c9205-fe76-40b8-9a54-e960c9a57576. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.433 183181 DEBUG oslo_concurrency.lockutils [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.433 183181 DEBUG oslo_concurrency.lockutils [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.433 183181 DEBUG nova.network.neutron [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Refreshing network info cache for port be9c9205-fe76-40b8-9a54-e960c9a57576 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.545 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.547 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.548 183181 INFO nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Creating image(s)
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.549 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.549 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.551 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.552 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.560 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.563 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.655 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.656 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.657 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.657 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.661 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.662 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.724 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.725 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.764 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.765 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.765 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.811 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.813 183181 DEBUG nova.virt.disk.api [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Checking if we can resize image /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.813 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.861 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.864 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.864 183181 DEBUG nova.virt.disk.api [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Cannot resize image /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.865 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.865 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Ensure instance console log exists: /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.865 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.866 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.866 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:21 compute-0 nova_compute[183177]: 2026-01-26 20:00:21.938 183181 WARNING neutronclient.v2_0.client [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.070 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.121 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.122 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.392 183181 DEBUG nova.network.neutron [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:00:22 compute-0 nova_compute[183177]: 2026-01-26 20:00:22.631 183181 DEBUG nova.network.neutron [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:00:23 compute-0 nova_compute[183177]: 2026-01-26 20:00:23.140 183181 DEBUG oslo_concurrency.lockutils [req-6621e54a-402e-4626-b43f-c45eb719bb0e req-573b8cb5-d57a-4b04-931a-2858dd26a0b8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:00:23 compute-0 nova_compute[183177]: 2026-01-26 20:00:23.142 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquired lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:00:23 compute-0 nova_compute[183177]: 2026-01-26 20:00:23.142 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:00:24 compute-0 nova_compute[183177]: 2026-01-26 20:00:24.018 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:00:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:24.081 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:24.082 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:24.082 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:24 compute-0 nova_compute[183177]: 2026-01-26 20:00:24.442 183181 WARNING neutronclient.v2_0.client [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:24 compute-0 nova_compute[183177]: 2026-01-26 20:00:24.575 183181 DEBUG nova.network.neutron [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Updating instance_info_cache with network_info: [{"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.086 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Releasing lock "refresh_cache-a68751cb-f30e-4bcd-a9d0-aaadca040a7c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.087 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance network_info: |[{"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.091 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Start _get_guest_xml network_info=[{"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.097 183181 WARNING nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.100 183181 DEBUG nova.virt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1078324294', uuid='a68751cb-f30e-4bcd-a9d0-aaadca040a7c'), owner=OwnerMeta(userid='7033feaa27a8427197df3725be3d1a7a', username='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin', projectid='3ab7d887b45a437cabdface06e8a9be1', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1111771467'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": 
"be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457625.099868) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.104 183181 DEBUG nova.virt.libvirt.host [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.106 183181 DEBUG nova.virt.libvirt.host [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.110 183181 DEBUG nova.virt.libvirt.host [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.111 183181 DEBUG nova.virt.libvirt.host [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.113 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.114 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.114 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.115 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.115 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.116 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.116 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.117 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.117 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.117 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.118 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.118 183181 DEBUG nova.virt.hardware [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.126 183181 DEBUG nova.virt.libvirt.vif [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1078324294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1078324294',id=23,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-zo0ufsle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-T
estExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:00:20Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=a68751cb-f30e-4bcd-a9d0-aaadca040a7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.127 183181 DEBUG nova.network.os_vif_util [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.128 183181 DEBUG nova.network.os_vif_util [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.130 183181 DEBUG nova.objects.instance [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'pci_devices' on Instance uuid a68751cb-f30e-4bcd-a9d0-aaadca040a7c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.640 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <uuid>a68751cb-f30e-4bcd-a9d0-aaadca040a7c</uuid>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <name>instance-00000017</name>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1078324294</nova:name>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:00:25</nova:creationTime>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:00:25 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:00:25 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         <nova:port uuid="be9c9205-fe76-40b8-9a54-e960c9a57576">
Jan 26 20:00:25 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <system>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="serial">a68751cb-f30e-4bcd-a9d0-aaadca040a7c</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="uuid">a68751cb-f30e-4bcd-a9d0-aaadca040a7c</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </system>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <os>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </os>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <features>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </features>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.config"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:d3:1b:7c"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <target dev="tapbe9c9205-fe"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/console.log" append="off"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <video>
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </video>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:00:25 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:00:25 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:00:25 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:00:25 compute-0 nova_compute[183177]: </domain>
Jan 26 20:00:25 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.642 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Preparing to wait for external event network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.642 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.642 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.643 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.643 183181 DEBUG nova.virt.libvirt.vif [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1078324294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1078324294',id=23,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-zo0ufsle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name=
'tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:00:20Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=a68751cb-f30e-4bcd-a9d0-aaadca040a7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.644 183181 DEBUG nova.network.os_vif_util [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.644 183181 DEBUG nova.network.os_vif_util [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.645 183181 DEBUG os_vif [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.645 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.646 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.646 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.647 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.647 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '99fc19f1-e0b7-55bf-b348-d809877a6c03', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.649 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.651 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.653 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.653 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe9c9205-fe, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.654 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbe9c9205-fe, col_values=(('qos', UUID('4104309f-ce70-48d9-add3-1249f48bcfc2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.654 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbe9c9205-fe, col_values=(('external_ids', {'iface-id': 'be9c9205-fe76-40b8-9a54-e960c9a57576', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:1b:7c', 'vm-uuid': 'a68751cb-f30e-4bcd-a9d0-aaadca040a7c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.655 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 NetworkManager[55489]: <info>  [1769457625.6567] manager: (tapbe9c9205-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.657 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.667 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:25 compute-0 nova_compute[183177]: 2026-01-26 20:00:25.668 183181 INFO os_vif [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe')
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.123 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:27 compute-0 podman[212686]: 2026-01-26 20:00:27.344443814 +0000 UTC m=+0.075103193 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 20:00:27 compute-0 podman[212685]: 2026-01-26 20:00:27.361520479 +0000 UTC m=+0.091954822 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350)
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.384 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.385 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.386 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No VIF found with MAC fa:16:3e:d3:1b:7c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.387 183181 INFO nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Using config drive
Jan 26 20:00:27 compute-0 podman[212684]: 2026-01-26 20:00:27.39532478 +0000 UTC m=+0.127340625 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Jan 26 20:00:27 compute-0 nova_compute[183177]: 2026-01-26 20:00:27.896 183181 WARNING neutronclient.v2_0.client [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.160 183181 INFO nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Creating config drive at /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.config
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.171 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpl3q8s01f execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.316 183181 DEBUG oslo_concurrency.processutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpl3q8s01f" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:28 compute-0 kernel: tapbe9c9205-fe: entered promiscuous mode
Jan 26 20:00:28 compute-0 NetworkManager[55489]: <info>  [1769457628.4198] manager: (tapbe9c9205-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 26 20:00:28 compute-0 ovn_controller[95396]: 2026-01-26T20:00:28Z|00170|binding|INFO|Claiming lport be9c9205-fe76-40b8-9a54-e960c9a57576 for this chassis.
Jan 26 20:00:28 compute-0 ovn_controller[95396]: 2026-01-26T20:00:28Z|00171|binding|INFO|be9c9205-fe76-40b8-9a54-e960c9a57576: Claiming fa:16:3e:d3:1b:7c 10.100.0.10
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.422 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.429 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:1b:7c 10.100.0.10'], port_security=['fa:16:3e:d3:1b:7c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a68751cb-f30e-4bcd-a9d0-aaadca040a7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=be9c9205-fe76-40b8-9a54-e960c9a57576) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.430 104672 INFO neutron.agent.ovn.metadata.agent [-] Port be9c9205-fe76-40b8-9a54-e960c9a57576 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c bound to our chassis
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.432 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:00:28 compute-0 ovn_controller[95396]: 2026-01-26T20:00:28Z|00172|binding|INFO|Setting lport be9c9205-fe76-40b8-9a54-e960c9a57576 ovn-installed in OVS
Jan 26 20:00:28 compute-0 ovn_controller[95396]: 2026-01-26T20:00:28Z|00173|binding|INFO|Setting lport be9c9205-fe76-40b8-9a54-e960c9a57576 up in Southbound
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.443 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.449 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.457 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[692643df-5226-44d1-8423-6207f8395535]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 systemd-udevd[212768]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:00:28 compute-0 systemd-machined[154465]: New machine qemu-16-instance-00000017.
Jan 26 20:00:28 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000017.
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.489 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0146b4-b0e0-48ef-a966-1346492132ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 NetworkManager[55489]: <info>  [1769457628.4952] device (tapbe9c9205-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.496 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[7b825723-c940-4a7a-ad80-e2f5044f5c9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 NetworkManager[55489]: <info>  [1769457628.4973] device (tapbe9c9205-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.552 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[abcac77c-740c-4d0e-8ae6-f59540d6d2c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.571 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[586833f1-7f45-4045-ae6b-bb18191791c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506448, 'reachable_time': 31878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212778, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.586 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[064f60be-5839-43f2-aedd-b4ec5bed92cf]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506463, 'tstamp': 506463}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212780, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506467, 'tstamp': 506467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212780, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.587 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.589 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:28 compute-0 nova_compute[183177]: 2026-01-26 20:00:28.590 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.591 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.591 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.591 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.592 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:00:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:28.593 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a22a5ee2-b96b-4e7c-9331-077a244edb01]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.146 183181 DEBUG nova.compute.manager [req-595049e8-251c-46bf-9b5a-3531367cca58 req-c75ed55d-b13c-402d-9c2c-6b3d3171afe2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.146 183181 DEBUG oslo_concurrency.lockutils [req-595049e8-251c-46bf-9b5a-3531367cca58 req-c75ed55d-b13c-402d-9c2c-6b3d3171afe2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.147 183181 DEBUG oslo_concurrency.lockutils [req-595049e8-251c-46bf-9b5a-3531367cca58 req-c75ed55d-b13c-402d-9c2c-6b3d3171afe2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.147 183181 DEBUG oslo_concurrency.lockutils [req-595049e8-251c-46bf-9b5a-3531367cca58 req-c75ed55d-b13c-402d-9c2c-6b3d3171afe2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.147 183181 DEBUG nova.compute.manager [req-595049e8-251c-46bf-9b5a-3531367cca58 req-c75ed55d-b13c-402d-9c2c-6b3d3171afe2 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Processing event network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.633 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.637 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.641 183181 INFO nova.virt.libvirt.driver [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance spawned successfully.
Jan 26 20:00:29 compute-0 nova_compute[183177]: 2026-01-26 20:00:29.641 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:00:29 compute-0 podman[192499]: time="2026-01-26T20:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:00:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:00:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2647 "" "Go-http-client/1.1"
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.158 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.160 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.160 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.161 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.162 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.163 183181 DEBUG nova.virt.libvirt.driver [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.656 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.676 183181 INFO nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Took 9.13 seconds to spawn the instance on the hypervisor.
Jan 26 20:00:30 compute-0 nova_compute[183177]: 2026-01-26 20:00:30.677 183181 DEBUG nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.206 183181 INFO nova.compute.manager [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Took 14.35 seconds to build instance.
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.238 183181 DEBUG nova.compute.manager [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.238 183181 DEBUG oslo_concurrency.lockutils [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.239 183181 DEBUG oslo_concurrency.lockutils [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.239 183181 DEBUG oslo_concurrency.lockutils [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.239 183181 DEBUG nova.compute.manager [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] No waiting events found dispatching network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.239 183181 WARNING nova.compute.manager [req-121b2f29-db03-4bef-bb27-511fc99efad4 req-1a9b8a29-a66a-4ae6-9b3b-d587f6b50d04 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received unexpected event network-vif-plugged-be9c9205-fe76-40b8-9a54-e960c9a57576 for instance with vm_state active and task_state None.
Jan 26 20:00:31 compute-0 openstack_network_exporter[195363]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:00:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:00:31 compute-0 openstack_network_exporter[195363]: ERROR   20:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:00:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:00:31 compute-0 nova_compute[183177]: 2026-01-26 20:00:31.712 183181 DEBUG oslo_concurrency.lockutils [None req-a0856234-beed-401d-b21c-46e7cabab458 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.874s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:32 compute-0 nova_compute[183177]: 2026-01-26 20:00:32.156 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:35 compute-0 podman[212788]: 2026-01-26 20:00:35.315489087 +0000 UTC m=+0.066994736 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:00:35 compute-0 nova_compute[183177]: 2026-01-26 20:00:35.659 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:37 compute-0 nova_compute[183177]: 2026-01-26 20:00:37.157 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:38 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 26 20:00:40 compute-0 nova_compute[183177]: 2026-01-26 20:00:40.662 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:42 compute-0 ovn_controller[95396]: 2026-01-26T20:00:42Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:1b:7c 10.100.0.10
Jan 26 20:00:42 compute-0 ovn_controller[95396]: 2026-01-26T20:00:42Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:1b:7c 10.100.0.10
Jan 26 20:00:42 compute-0 nova_compute[183177]: 2026-01-26 20:00:42.160 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:45 compute-0 nova_compute[183177]: 2026-01-26 20:00:45.664 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:46 compute-0 nova_compute[183177]: 2026-01-26 20:00:46.688 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Check if temp file /var/lib/nova/instances/tmp2rdd54jb exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:00:46 compute-0 nova_compute[183177]: 2026-01-26 20:00:46.693 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2rdd54jb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='38add01c-5130-4743-8bfc-a2cd9eef81ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:00:47 compute-0 nova_compute[183177]: 2026-01-26 20:00:47.163 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:50 compute-0 nova_compute[183177]: 2026-01-26 20:00:50.668 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.023 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.083 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.085 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.161 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.162 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Preparing to wait for external event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.163 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.163 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:51 compute-0 nova_compute[183177]: 2026-01-26 20:00:51.163 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:52 compute-0 nova_compute[183177]: 2026-01-26 20:00:52.165 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:53 compute-0 sshd-session[212846]: Connection closed by authenticating user root 142.93.140.142 port 33074 [preauth]
Jan 26 20:00:55 compute-0 nova_compute[183177]: 2026-01-26 20:00:55.671 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.190 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:57.497 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:00:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:57.499 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.500 183181 DEBUG nova.compute.manager [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.500 183181 DEBUG oslo_concurrency.lockutils [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.501 183181 DEBUG oslo_concurrency.lockutils [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.501 183181 DEBUG oslo_concurrency.lockutils [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.502 183181 DEBUG nova.compute.manager [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No event matching network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 in dict_keys([('network-vif-plugged', '471030f9-6931-4461-ad62-0082b9c63094')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.502 183181 DEBUG nova.compute.manager [req-9441d04a-b5e9-4d65-9873-17d4e11396ab req-0db287c8-b920-4fff-ad8d-844cdb8098ff 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:00:57 compute-0 nova_compute[183177]: 2026-01-26 20:00:57.502 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:00:58 compute-0 podman[212851]: 2026-01-26 20:00:58.374287064 +0000 UTC m=+0.094058348 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 20:00:58 compute-0 podman[212850]: 2026-01-26 20:00:58.376677227 +0000 UTC m=+0.104778633 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 20:00:58 compute-0 ovn_controller[95396]: 2026-01-26T20:00:58Z|00174|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 20:00:58 compute-0 podman[212849]: 2026-01-26 20:00:58.456811643 +0000 UTC m=+0.190992052 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 20:00:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:00:58.501 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.192 183181 INFO nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Took 8.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.598 183181 DEBUG nova.compute.manager [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.599 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.599 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.600 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.600 183181 DEBUG nova.compute.manager [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Processing event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.601 183181 DEBUG nova.compute.manager [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-changed-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.602 183181 DEBUG nova.compute.manager [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Refreshing instance network info cache due to event network-changed-471030f9-6931-4461-ad62-0082b9c63094. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.602 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.603 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.603 183181 DEBUG nova.network.neutron [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Refreshing network info cache for port 471030f9-6931-4461-ad62-0082b9c63094 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:00:59 compute-0 nova_compute[183177]: 2026-01-26 20:00:59.605 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:00:59 compute-0 podman[192499]: time="2026-01-26T20:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:00:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:00:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.114 183181 WARNING neutronclient.v2_0.client [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.121 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2rdd54jb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='38add01c-5130-4743-8bfc-a2cd9eef81ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0f962fa0-3aaf-4c67-abf0-93be7d2caef5),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.648 183181 DEBUG nova.objects.instance [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 38add01c-5130-4743-8bfc-a2cd9eef81ef obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.649 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.651 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.652 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:01:00 compute-0 nova_compute[183177]: 2026-01-26 20:01:00.674 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.154 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.155 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.162 183181 DEBUG nova.virt.libvirt.vif [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-764172640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-764172640',id=22,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:00:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-ktquge2x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:00:10Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=38add01c-5130-4743-8bfc-a2cd9eef81ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.162 183181 DEBUG nova.network.os_vif_util [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.163 183181 DEBUG nova.network.os_vif_util [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.164 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:7d:f7:01"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <target dev="tap471030f9-69"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]: </interface>
Jan 26 20:01:01 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.165 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <name>instance-00000016</name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <uuid>38add01c-5130-4743-8bfc-a2cd9eef81ef</uuid>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-764172640</nova:name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:00:01</nova:creationTime>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:port uuid="471030f9-6931-4461-ad62-0082b9c63094">
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="serial">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="uuid">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:7d:f7:01"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap471030f9-69"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </target>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </console>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </input>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]: </domain>
Jan 26 20:01:01 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.166 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <name>instance-00000016</name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <uuid>38add01c-5130-4743-8bfc-a2cd9eef81ef</uuid>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-764172640</nova:name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:00:01</nova:creationTime>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:port uuid="471030f9-6931-4461-ad62-0082b9c63094">
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="serial">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="uuid">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:7d:f7:01"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="tap471030f9-69"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </target>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </console>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </input>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]: </domain>
Jan 26 20:01:01 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.167 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <name>instance-00000016</name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <uuid>38add01c-5130-4743-8bfc-a2cd9eef81ef</uuid>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-764172640</nova:name>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:00:01</nova:creationTime>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <nova:port uuid="471030f9-6931-4461-ad62-0082b9c63094">
Jan 26 20:01:01 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="serial">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="uuid">38add01c-5130-4743-8bfc-a2cd9eef81ef</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </system>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </os>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </features>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/disk.config"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:7d:f7:01"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap471030f9-69"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:01:01 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       </target>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef/console.log" append="off"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </console>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </input>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </video>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:01:01 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:01:01 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:01:01 compute-0 nova_compute[183177]: </domain>
Jan 26 20:01:01 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.168 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:01:01 compute-0 openstack_network_exporter[195363]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:01:01 compute-0 openstack_network_exporter[195363]: ERROR   20:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.658 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.659 183181 INFO nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:01:01 compute-0 CROND[212912]: (root) CMD (run-parts /etc/cron.hourly)
Jan 26 20:01:01 compute-0 run-parts[212915]: (/etc/cron.hourly) starting 0anacron
Jan 26 20:01:01 compute-0 run-parts[212921]: (/etc/cron.hourly) finished 0anacron
Jan 26 20:01:01 compute-0 CROND[212911]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 26 20:01:01 compute-0 nova_compute[183177]: 2026-01-26 20:01:01.948 183181 WARNING neutronclient.v2_0.client [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:02 compute-0 nova_compute[183177]: 2026-01-26 20:01:02.193 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:02 compute-0 nova_compute[183177]: 2026-01-26 20:01:02.388 183181 DEBUG nova.network.neutron [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updated VIF entry in instance network info cache for port 471030f9-6931-4461-ad62-0082b9c63094. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:01:02 compute-0 nova_compute[183177]: 2026-01-26 20:01:02.389 183181 DEBUG nova.network.neutron [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updating instance_info_cache with network_info: [{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:01:02 compute-0 nova_compute[183177]: 2026-01-26 20:01:02.692 183181 INFO nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:01:02 compute-0 nova_compute[183177]: 2026-01-26 20:01:02.896 183181 DEBUG oslo_concurrency.lockutils [req-9f2c5f9a-6a49-4628-8fde-7a748c3eec80 req-ac974d63-20d7-4b3d-b58a-e85221de676a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-38add01c-5130-4743-8bfc-a2cd9eef81ef" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:01:03 compute-0 nova_compute[183177]: 2026-01-26 20:01:03.196 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:01:03 compute-0 nova_compute[183177]: 2026-01-26 20:01:03.197 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:01:03 compute-0 sshd-session[212922]: Connection closed by authenticating user root 188.166.116.149 port 42856 [preauth]
Jan 26 20:01:03 compute-0 nova_compute[183177]: 2026-01-26 20:01:03.714 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:01:03 compute-0 nova_compute[183177]: 2026-01-26 20:01:03.715 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.220 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.220 183181 DEBUG nova.virt.libvirt.migration [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:01:04 compute-0 kernel: tap471030f9-69 (unregistering): left promiscuous mode
Jan 26 20:01:04 compute-0 NetworkManager[55489]: <info>  [1769457664.2407] device (tap471030f9-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.246 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:04 compute-0 ovn_controller[95396]: 2026-01-26T20:01:04Z|00175|binding|INFO|Releasing lport 471030f9-6931-4461-ad62-0082b9c63094 from this chassis (sb_readonly=0)
Jan 26 20:01:04 compute-0 ovn_controller[95396]: 2026-01-26T20:01:04Z|00176|binding|INFO|Setting lport 471030f9-6931-4461-ad62-0082b9c63094 down in Southbound
Jan 26 20:01:04 compute-0 ovn_controller[95396]: 2026-01-26T20:01:04Z|00177|binding|INFO|Removing iface tap471030f9-69 ovn-installed in OVS
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.250 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.259 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f7:01 10.100.0.3'], port_security=['fa:16:3e:7d:f7:01 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '38add01c-5130-4743-8bfc-a2cd9eef81ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=471030f9-6931-4461-ad62-0082b9c63094) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.260 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 471030f9-6931-4461-ad62-0082b9c63094 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c unbound from our chassis
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.261 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.271 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.286 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7261c97a-8363-4f90-835b-a7d1619dfa46]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Deactivated successfully.
Jan 26 20:01:04 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000016.scope: Consumed 15.822s CPU time.
Jan 26 20:01:04 compute-0 systemd-machined[154465]: Machine qemu-15-instance-00000016 terminated.
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.329 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d4168a54-2bb9-4c60-99c0-b7ee546ffd61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.332 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e7dfc3-e4ca-4e4a-a8f6-4e19a1f270a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.376 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea3eab7-af01-4a4d-8569-3d0b7d0f71c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.402 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc79573-787c-416d-9d5b-f8b4b4649b42]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506448, 'reachable_time': 31878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212950, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.426 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dab04bee-3c73-4429-85be-67c42bb53c22]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506463, 'tstamp': 506463}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212951, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506467, 'tstamp': 506467}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212951, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.427 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.429 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.434 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.434 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.434 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.435 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.435 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:01:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:04.437 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9672c5ca-257f-431d-a403-89b2df23c178]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.507 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.508 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.508 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.726 183181 DEBUG nova.virt.libvirt.guest [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '38add01c-5130-4743-8bfc-a2cd9eef81ef' (instance-00000016) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.727 183181 INFO nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migration operation has completed
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.727 183181 INFO nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] _post_live_migration() is started..
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.749 183181 WARNING neutronclient.v2_0.client [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:04 compute-0 nova_compute[183177]: 2026-01-26 20:01:04.750 183181 WARNING neutronclient.v2_0.client [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.298 183181 DEBUG nova.compute.manager [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.299 183181 DEBUG oslo_concurrency.lockutils [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.299 183181 DEBUG oslo_concurrency.lockutils [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.300 183181 DEBUG oslo_concurrency.lockutils [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.300 183181 DEBUG nova.compute.manager [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No waiting events found dispatching network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.300 183181 DEBUG nova.compute.manager [req-4c9cb847-9a9a-4fce-bbf2-f9ad213fcfe2 req-0130abeb-7321-41ab-9297-d47d5117ca0d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:01:05 compute-0 nova_compute[183177]: 2026-01-26 20:01:05.677 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:05 compute-0 sshd-session[212970]: Connection closed by authenticating user root 193.32.162.151 port 43470 [preauth]
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.151 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:06 compute-0 podman[212972]: 2026-01-26 20:01:06.356645038 +0000 UTC m=+0.095978529 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:06 compute-0 nova_compute[183177]: 2026-01-26 20:01:06.671 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.104 183181 DEBUG nova.network.neutron [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 471030f9-6931-4461-ad62-0082b9c63094 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.105 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.107 183181 DEBUG nova.virt.libvirt.vif [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T19:59:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-764172640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-764172640',id=22,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:00:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-ktquge2x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:00:42Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=38add01c-5130-4743-8bfc-a2cd9eef81ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.107 183181 DEBUG nova.network.os_vif_util [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "471030f9-6931-4461-ad62-0082b9c63094", "address": "fa:16:3e:7d:f7:01", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap471030f9-69", "ovs_interfaceid": "471030f9-6931-4461-ad62-0082b9c63094", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.108 183181 DEBUG nova.network.os_vif_util [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.109 183181 DEBUG os_vif [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.112 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.113 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap471030f9-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.116 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.118 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.118 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=370d3c70-ed37-41ac-ab01-e2391c034d05) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.119 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.120 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.123 183181 INFO os_vif [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f7:01,bridge_name='br-int',has_traffic_filtering=True,id=471030f9-6931-4461-ad62-0082b9c63094,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap471030f9-69')
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.124 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.124 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.125 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.125 183181 DEBUG nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.126 183181 INFO nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Deleting instance files /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef_del
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.127 183181 INFO nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Deletion of /var/lib/nova/instances/38add01c-5130-4743-8bfc-a2cd9eef81ef_del complete
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.196 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.357 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.357 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.358 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.358 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.358 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No waiting events found dispatching network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.359 183181 WARNING nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received unexpected event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with vm_state active and task_state migrating.
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.359 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.359 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.360 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.360 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.360 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No waiting events found dispatching network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.361 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-unplugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.361 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.361 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.362 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.362 183181 DEBUG oslo_concurrency.lockutils [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.362 183181 DEBUG nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] No waiting events found dispatching network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.363 183181 WARNING nova.compute.manager [req-47c2f8ab-afd0-4ff6-be10-88fdec89b748 req-78c0c0fa-a10b-450c-a33e-5f6473db08ab 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Received unexpected event network-vif-plugged-471030f9-6931-4461-ad62-0082b9c63094 for instance with vm_state active and task_state migrating.
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.734 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.798 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.799 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:07 compute-0 nova_compute[183177]: 2026-01-26 20:01:07.893 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.111 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.112 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.155 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.155 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5530MB free_disk=73.04085922241211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.155 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:08 compute-0 nova_compute[183177]: 2026-01-26 20:01:08.156 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.184 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Updating resource usage from migration 0f962fa0-3aaf-4c67-abf0-93be7d2caef5
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.212 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance a68751cb-f30e-4bcd-a9d0-aaadca040a7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.212 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration 0f962fa0-3aaf-4c67-abf0-93be7d2caef5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.213 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.214 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:01:08 up  1:25,  0 user,  load average: 0.20, 0.22, 0.27\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '1', 'num_os_type_None': '2', 'num_proj_3ab7d887b45a437cabdface06e8a9be1': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.292 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:01:09 compute-0 nova_compute[183177]: 2026-01-26 20:01:09.819 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:01:10 compute-0 nova_compute[183177]: 2026-01-26 20:01:10.328 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:01:10 compute-0 nova_compute[183177]: 2026-01-26 20:01:10.329 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.173s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:12 compute-0 nova_compute[183177]: 2026-01-26 20:01:12.121 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:12 compute-0 nova_compute[183177]: 2026-01-26 20:01:12.200 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:13 compute-0 nova_compute[183177]: 2026-01-26 20:01:13.332 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:13 compute-0 nova_compute[183177]: 2026-01-26 20:01:13.332 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:13 compute-0 nova_compute[183177]: 2026-01-26 20:01:13.333 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:13 compute-0 nova_compute[183177]: 2026-01-26 20:01:13.333 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:13 compute-0 nova_compute[183177]: 2026-01-26 20:01:13.333 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:01:15 compute-0 nova_compute[183177]: 2026-01-26 20:01:15.671 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:15 compute-0 nova_compute[183177]: 2026-01-26 20:01:15.672 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:15 compute-0 nova_compute[183177]: 2026-01-26 20:01:15.673 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "38add01c-5130-4743-8bfc-a2cd9eef81ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:16 compute-0 nova_compute[183177]: 2026-01-26 20:01:16.189 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:16 compute-0 nova_compute[183177]: 2026-01-26 20:01:16.189 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:16 compute-0 nova_compute[183177]: 2026-01-26 20:01:16.190 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:16 compute-0 nova_compute[183177]: 2026-01-26 20:01:16.190 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.137 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.201 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.236 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.327 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.328 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.378 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.575 183181 WARNING nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.578 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.607 183181 DEBUG oslo_concurrency.processutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.609 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5540MB free_disk=73.06943893432617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.609 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:17 compute-0 nova_compute[183177]: 2026-01-26 20:01:17.610 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:18 compute-0 nova_compute[183177]: 2026-01-26 20:01:18.642 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 38add01c-5130-4743-8bfc-a2cd9eef81ef refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.158 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.195 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Instance a68751cb-f30e-4bcd-a9d0-aaadca040a7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.195 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 0f962fa0-3aaf-4c67-abf0-93be7d2caef5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.196 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.196 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:01:17 up  1:25,  0 user,  load average: 0.17, 0.22, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3ab7d887b45a437cabdface06e8a9be1': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.268 183181 DEBUG nova.compute.provider_tree [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:01:19 compute-0 nova_compute[183177]: 2026-01-26 20:01:19.776 183181 DEBUG nova.scheduler.client.report [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:01:20 compute-0 nova_compute[183177]: 2026-01-26 20:01:20.290 183181 DEBUG nova.compute.resource_tracker [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:01:20 compute-0 nova_compute[183177]: 2026-01-26 20:01:20.290 183181 DEBUG oslo_concurrency.lockutils [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.681s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:20 compute-0 nova_compute[183177]: 2026-01-26 20:01:20.317 183181 INFO nova.compute.manager [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.437 183181 INFO nova.scheduler.client.report [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 0f962fa0-3aaf-4c67-abf0-93be7d2caef5
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.438 183181 DEBUG nova.virt.libvirt.driver [None req-393f7b57-6de7-416a-a2d7-926cf2b2a4e8 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 38add01c-5130-4743-8bfc-a2cd9eef81ef] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.721 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.721 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.722 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.722 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.722 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:21 compute-0 nova_compute[183177]: 2026-01-26 20:01:21.733 183181 INFO nova.compute.manager [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Terminating instance
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.177 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.203 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.260 183181 DEBUG nova.compute.manager [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 20:01:22 compute-0 kernel: tapbe9c9205-fe (unregistering): left promiscuous mode
Jan 26 20:01:22 compute-0 NetworkManager[55489]: <info>  [1769457682.2880] device (tapbe9c9205-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.290 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 ovn_controller[95396]: 2026-01-26T20:01:22Z|00178|binding|INFO|Releasing lport be9c9205-fe76-40b8-9a54-e960c9a57576 from this chassis (sb_readonly=0)
Jan 26 20:01:22 compute-0 ovn_controller[95396]: 2026-01-26T20:01:22Z|00179|binding|INFO|Setting lport be9c9205-fe76-40b8-9a54-e960c9a57576 down in Southbound
Jan 26 20:01:22 compute-0 ovn_controller[95396]: 2026-01-26T20:01:22Z|00180|binding|INFO|Removing iface tapbe9c9205-fe ovn-installed in OVS
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.303 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:1b:7c 10.100.0.10'], port_security=['fa:16:3e:d3:1b:7c 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a68751cb-f30e-4bcd-a9d0-aaadca040a7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=be9c9205-fe76-40b8-9a54-e960c9a57576) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.305 104672 INFO neutron.agent.ovn.metadata.agent [-] Port be9c9205-fe76-40b8-9a54-e960c9a57576 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c unbound from our chassis
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.306 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1220de0-89fa-4020-84a8-6d0a816a5b3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.307 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0180bb12-1dbf-4086-91d6-93ef94021ec2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.308 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c namespace which is not needed anymore
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.325 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 26 20:01:22 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000017.scope: Consumed 15.369s CPU time.
Jan 26 20:01:22 compute-0 systemd-machined[154465]: Machine qemu-16-instance-00000017 terminated.
Jan 26 20:01:22 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [NOTICE]   (212632) : haproxy version is 3.0.5-8e879a5
Jan 26 20:01:22 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [NOTICE]   (212632) : path to executable is /usr/sbin/haproxy
Jan 26 20:01:22 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [WARNING]  (212632) : Exiting Master process...
Jan 26 20:01:22 compute-0 podman[213039]: 2026-01-26 20:01:22.459596235 +0000 UTC m=+0.029640141 container kill 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 20:01:22 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [ALERT]    (212632) : Current worker (212634) exited with code 143 (Terminated)
Jan 26 20:01:22 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[212627]: [WARNING]  (212632) : All workers exited. Exiting... (0)
Jan 26 20:01:22 compute-0 systemd[1]: libpod-4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808.scope: Deactivated successfully.
Jan 26 20:01:22 compute-0 podman[213056]: 2026-01-26 20:01:22.516340846 +0000 UTC m=+0.027776691 container died 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.530 183181 INFO nova.virt.libvirt.driver [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Instance destroyed successfully.
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.530 183181 DEBUG nova.objects.instance [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'resources' on Instance uuid a68751cb-f30e-4bcd-a9d0-aaadca040a7c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:01:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808-userdata-shm.mount: Deactivated successfully.
Jan 26 20:01:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-e931d3a7067adc7f52a029288188d6ff7bc8ae757e28b957efc7697ad8496d9e-merged.mount: Deactivated successfully.
Jan 26 20:01:22 compute-0 podman[213056]: 2026-01-26 20:01:22.5581168 +0000 UTC m=+0.069552625 container cleanup 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 26 20:01:22 compute-0 systemd[1]: libpod-conmon-4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808.scope: Deactivated successfully.
Jan 26 20:01:22 compute-0 podman[213059]: 2026-01-26 20:01:22.576191032 +0000 UTC m=+0.086067365 container remove 4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.581 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[629b958e-9da4-49e0-8a2c-810a0b523203]: (4, ("Mon Jan 26 08:01:22 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c (4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808)\n4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808\nMon Jan 26 08:01:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c (4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808)\n4af70339a62af5053d6978899bc00110cdce32ab019393b0497644bb42f56808\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.583 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[58702f09-18fd-4b90-9e96-42774d23bb85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.583 183181 DEBUG nova.compute.manager [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.583 183181 DEBUG oslo_concurrency.lockutils [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.583 183181 DEBUG oslo_concurrency.lockutils [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.583 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.584 183181 DEBUG oslo_concurrency.lockutils [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.584 183181 DEBUG nova.compute.manager [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] No waiting events found dispatching network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.584 183181 DEBUG nova.compute.manager [req-44b9a832-dfde-4724-aede-9ced4b85301c req-b92e22d1-11d4-4af7-bacb-c78bf9c1f612 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.584 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[81fd01a3-06ae-46d5-bf87-f8bd991ebc55]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.585 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:22 compute-0 kernel: tapd1220de0-80: left promiscuous mode
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.586 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.600 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 nova_compute[183177]: 2026-01-26 20:01:22.601 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.603 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f85b1065-e760-4b91-8cac-1e65d48f6147]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.615 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d95d2ac-b50a-4729-8c80-f871982a9261]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.616 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c12ca3-0a49-4631-9886-c92c9f4f1714]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.630 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[32f2df98-6edc-4ede-a4a5-2ab7775001ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506441, 'reachable_time': 36879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213108, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:22 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1220de0\x2d89fa\x2d4020\x2d84a8\x2d6d0a816a5b3c.mount: Deactivated successfully.
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.634 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:01:22 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:22.634 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[d200f24a-1ba2-4025-a314-87a759caba59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.037 183181 DEBUG nova.virt.libvirt.vif [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:00:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1078324294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1078324294',id=23,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:00:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-zo0ufsle',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:00:30Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=a68751cb-f30e-4bcd-a9d0-aaadca040a7c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.037 183181 DEBUG nova.network.os_vif_util [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "be9c9205-fe76-40b8-9a54-e960c9a57576", "address": "fa:16:3e:d3:1b:7c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe9c9205-fe", "ovs_interfaceid": "be9c9205-fe76-40b8-9a54-e960c9a57576", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.039 183181 DEBUG nova.network.os_vif_util [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.040 183181 DEBUG os_vif [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.044 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.045 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe9c9205-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.046 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.048 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.048 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.049 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4104309f-ce70-48d9-add3-1249f48bcfc2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.049 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.050 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.053 183181 INFO os_vif [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:1b:7c,bridge_name='br-int',has_traffic_filtering=True,id=be9c9205-fe76-40b8-9a54-e960c9a57576,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe9c9205-fe')
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.053 183181 INFO nova.virt.libvirt.driver [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Deleting instance files /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c_del
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.054 183181 INFO nova.virt.libvirt.driver [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Deletion of /var/lib/nova/instances/a68751cb-f30e-4bcd-a9d0-aaadca040a7c_del complete
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.570 183181 INFO nova.compute.manager [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.572 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.572 183181 DEBUG nova.compute.manager [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.573 183181 DEBUG nova.network.neutron [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.573 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:23 compute-0 nova_compute[183177]: 2026-01-26 20:01:23.811 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:24.083 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:24.084 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:24.084 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.663 183181 DEBUG nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.663 183181 DEBUG oslo_concurrency.lockutils [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.664 183181 DEBUG oslo_concurrency.lockutils [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.665 183181 DEBUG oslo_concurrency.lockutils [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.665 183181 DEBUG nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] No waiting events found dispatching network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.666 183181 DEBUG nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-unplugged-be9c9205-fe76-40b8-9a54-e960c9a57576 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.666 183181 DEBUG nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Received event network-vif-deleted-be9c9205-fe76-40b8-9a54-e960c9a57576 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.666 183181 INFO nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Neutron deleted interface be9c9205-fe76-40b8-9a54-e960c9a57576; detaching it from the instance and deleting it from the info cache
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.667 183181 DEBUG nova.network.neutron [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:01:24 compute-0 nova_compute[183177]: 2026-01-26 20:01:24.901 183181 DEBUG nova.network.neutron [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:01:25 compute-0 nova_compute[183177]: 2026-01-26 20:01:25.175 183181 DEBUG nova.compute.manager [req-6d25e6f7-67a7-4473-b77e-31253d134b31 req-6be2b341-0b34-4b60-8a4a-b8bd47e213b1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Detach interface failed, port_id=be9c9205-fe76-40b8-9a54-e960c9a57576, reason: Instance a68751cb-f30e-4bcd-a9d0-aaadca040a7c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 20:01:25 compute-0 nova_compute[183177]: 2026-01-26 20:01:25.460 183181 INFO nova.compute.manager [-] [instance: a68751cb-f30e-4bcd-a9d0-aaadca040a7c] Took 1.89 seconds to deallocate network for instance.
Jan 26 20:01:25 compute-0 nova_compute[183177]: 2026-01-26 20:01:25.986 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:25 compute-0 nova_compute[183177]: 2026-01-26 20:01:25.987 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:26 compute-0 nova_compute[183177]: 2026-01-26 20:01:26.075 183181 DEBUG nova.compute.provider_tree [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:01:26 compute-0 nova_compute[183177]: 2026-01-26 20:01:26.584 183181 DEBUG nova.scheduler.client.report [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:01:27 compute-0 nova_compute[183177]: 2026-01-26 20:01:27.233 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.246s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:27 compute-0 nova_compute[183177]: 2026-01-26 20:01:27.259 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:27 compute-0 nova_compute[183177]: 2026-01-26 20:01:27.266 183181 INFO nova.scheduler.client.report [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Deleted allocations for instance a68751cb-f30e-4bcd-a9d0-aaadca040a7c
Jan 26 20:01:28 compute-0 nova_compute[183177]: 2026-01-26 20:01:28.050 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:28 compute-0 nova_compute[183177]: 2026-01-26 20:01:28.297 183181 DEBUG oslo_concurrency.lockutils [None req-0f09abee-ea03-4b5e-a1ec-7bd96f6bbf6e 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "a68751cb-f30e-4bcd-a9d0-aaadca040a7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.576s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:29 compute-0 podman[213111]: 2026-01-26 20:01:29.351337702 +0000 UTC m=+0.094046777 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 20:01:29 compute-0 podman[213112]: 2026-01-26 20:01:29.354104566 +0000 UTC m=+0.081226376 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Jan 26 20:01:29 compute-0 podman[213110]: 2026-01-26 20:01:29.428392416 +0000 UTC m=+0.167766363 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, container_name=ovn_controller)
Jan 26 20:01:29 compute-0 podman[192499]: time="2026-01-26T20:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:01:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:01:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 26 20:01:31 compute-0 openstack_network_exporter[195363]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:01:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:01:31 compute-0 openstack_network_exporter[195363]: ERROR   20:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:01:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:01:32 compute-0 sshd-session[213175]: Connection closed by authenticating user root 142.93.140.142 port 44846 [preauth]
Jan 26 20:01:32 compute-0 nova_compute[183177]: 2026-01-26 20:01:32.261 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:33 compute-0 nova_compute[183177]: 2026-01-26 20:01:33.062 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:37 compute-0 nova_compute[183177]: 2026-01-26 20:01:37.295 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:37 compute-0 podman[213177]: 2026-01-26 20:01:37.35294088 +0000 UTC m=+0.094837549 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:01:38 compute-0 nova_compute[183177]: 2026-01-26 20:01:38.067 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:38 compute-0 nova_compute[183177]: 2026-01-26 20:01:38.783 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:38 compute-0 nova_compute[183177]: 2026-01-26 20:01:38.784 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:39 compute-0 nova_compute[183177]: 2026-01-26 20:01:39.288 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:01:39 compute-0 nova_compute[183177]: 2026-01-26 20:01:39.841 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:39 compute-0 nova_compute[183177]: 2026-01-26 20:01:39.842 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:39 compute-0 nova_compute[183177]: 2026-01-26 20:01:39.850 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:01:39 compute-0 nova_compute[183177]: 2026-01-26 20:01:39.851 183181 INFO nova.compute.claims [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:01:40 compute-0 nova_compute[183177]: 2026-01-26 20:01:40.916 183181 DEBUG nova.compute.provider_tree [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:01:41 compute-0 nova_compute[183177]: 2026-01-26 20:01:41.431 183181 DEBUG nova.scheduler.client.report [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:01:41 compute-0 nova_compute[183177]: 2026-01-26 20:01:41.940 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:41 compute-0 nova_compute[183177]: 2026-01-26 20:01:41.941 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.299 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.454 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.454 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.454 183181 WARNING neutronclient.v2_0.client [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.455 183181 WARNING neutronclient.v2_0.client [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:42 compute-0 nova_compute[183177]: 2026-01-26 20:01:42.962 183181 INFO nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:01:43 compute-0 nova_compute[183177]: 2026-01-26 20:01:43.055 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Successfully created port: 73e3d3b5-4b31-402f-9d36-37f620d86504 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:01:43 compute-0 nova_compute[183177]: 2026-01-26 20:01:43.069 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:43 compute-0 nova_compute[183177]: 2026-01-26 20:01:43.479 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.045 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Successfully updated port: 73e3d3b5-4b31-402f-9d36-37f620d86504 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.134 183181 DEBUG nova.compute.manager [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-changed-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.135 183181 DEBUG nova.compute.manager [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Refreshing instance network info cache due to event network-changed-73e3d3b5-4b31-402f-9d36-37f620d86504. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.135 183181 DEBUG oslo_concurrency.lockutils [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.136 183181 DEBUG oslo_concurrency.lockutils [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.136 183181 DEBUG nova.network.neutron [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Refreshing network info cache for port 73e3d3b5-4b31-402f-9d36-37f620d86504 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.501 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.503 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.504 183181 INFO nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Creating image(s)
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.505 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.505 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.507 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.508 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.515 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.518 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.553 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.614 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.615 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.616 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.616 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.620 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.621 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.645 183181 WARNING neutronclient.v2_0.client [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.690 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.691 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.732 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.733 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.733 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.798 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.799 183181 DEBUG nova.virt.disk.api [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Checking if we can resize image /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.800 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.852 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.853 183181 DEBUG nova.virt.disk.api [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Cannot resize image /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.853 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.854 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Ensure instance console log exists: /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.854 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.855 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:44 compute-0 nova_compute[183177]: 2026-01-26 20:01:44.855 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:45 compute-0 nova_compute[183177]: 2026-01-26 20:01:45.153 183181 DEBUG nova.network.neutron [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:01:45 compute-0 sshd-session[213205]: Connection closed by authenticating user root 188.166.116.149 port 59532 [preauth]
Jan 26 20:01:45 compute-0 nova_compute[183177]: 2026-01-26 20:01:45.329 183181 DEBUG nova.network.neutron [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:01:45 compute-0 nova_compute[183177]: 2026-01-26 20:01:45.838 183181 DEBUG oslo_concurrency.lockutils [req-01e39009-b3cf-4362-8fc0-b1acda87ab83 req-b3d647c8-16b1-46c9-935a-79dc40639550 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:01:45 compute-0 nova_compute[183177]: 2026-01-26 20:01:45.839 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquired lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:01:45 compute-0 nova_compute[183177]: 2026-01-26 20:01:45.840 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:01:47 compute-0 nova_compute[183177]: 2026-01-26 20:01:47.029 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:01:47 compute-0 nova_compute[183177]: 2026-01-26 20:01:47.300 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:47 compute-0 nova_compute[183177]: 2026-01-26 20:01:47.308 183181 WARNING neutronclient.v2_0.client [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.072 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.157 183181 DEBUG nova.network.neutron [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Updating instance_info_cache with network_info: [{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.706 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Releasing lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.706 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance network_info: |[{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.709 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Start _get_guest_xml network_info=[{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.713 183181 WARNING nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.715 183181 DEBUG nova.virt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426', uuid='c1c1164e-2433-4d44-a423-73e2a317c3c1'), owner=OwnerMeta(userid='7033feaa27a8427197df3725be3d1a7a', username='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin', projectid='3ab7d887b45a437cabdface06e8a9be1', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1111771467'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-1757377249', flavorid='63d24b8a-64ab-43ac-be82-04ef5f355697', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": 
"73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457708.7149725) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.759 183181 DEBUG nova.virt.libvirt.host [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.760 183181 DEBUG nova.virt.libvirt.host [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.763 183181 DEBUG nova.virt.libvirt.host [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.763 183181 DEBUG nova.virt.libvirt.host [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.764 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.765 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T20:01:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='63d24b8a-64ab-43ac-be82-04ef5f355697',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-1757377249',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.765 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.765 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.766 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.766 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.766 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.766 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.766 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.767 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.767 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.767 183181 DEBUG nova.virt.hardware [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.771 183181 DEBUG nova.virt.libvirt.vif [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:01:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1320665426',id=24,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-9i1e2zf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-
TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:01:43Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=c1c1164e-2433-4d44-a423-73e2a317c3c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.771 183181 DEBUG nova.network.os_vif_util [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.772 183181 DEBUG nova.network.os_vif_util [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:01:48 compute-0 nova_compute[183177]: 2026-01-26 20:01:48.773 183181 DEBUG nova.objects.instance [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1c1164e-2433-4d44-a423-73e2a317c3c1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.281 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <uuid>c1c1164e-2433-4d44-a423-73e2a317c3c1</uuid>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <name>instance-00000018</name>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <memory>1178624</memory>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426</nova:name>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:01:48</nova:creationTime>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:flavor name="tempest-watcher_flavor-1757377249" id="63d24b8a-64ab-43ac-be82-04ef5f355697">
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:memory>1151</nova:memory>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:extraSpecs/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:01:49 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         <nova:port uuid="73e3d3b5-4b31-402f-9d36-37f620d86504">
Jan 26 20:01:49 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <system>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="serial">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="uuid">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </system>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <os>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </os>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <features>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </features>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:71:59:1c"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <target dev="tap73e3d3b5-4b"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <video>
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </video>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:01:49 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:01:49 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:01:49 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:01:49 compute-0 nova_compute[183177]: </domain>
Jan 26 20:01:49 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.282 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Preparing to wait for external event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.282 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.283 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.283 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.284 183181 DEBUG nova.virt.libvirt.vif [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:01:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1320665426',id=24,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-9i1e2zf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name
='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:01:43Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=c1c1164e-2433-4d44-a423-73e2a317c3c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.285 183181 DEBUG nova.network.os_vif_util [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.286 183181 DEBUG nova.network.os_vif_util [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.286 183181 DEBUG os_vif [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.287 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.288 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.288 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.289 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.290 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ee94df3e-2702-5b5c-a0f2-0b272af16168', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.291 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.293 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.296 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.297 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73e3d3b5-4b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.298 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap73e3d3b5-4b, col_values=(('qos', UUID('e0ad11fe-d2c4-4cc2-8f7e-8026a0f093e7')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.299 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap73e3d3b5-4b, col_values=(('external_ids', {'iface-id': '73e3d3b5-4b31-402f-9d36-37f620d86504', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:59:1c', 'vm-uuid': 'c1c1164e-2433-4d44-a423-73e2a317c3c1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.301 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 NetworkManager[55489]: <info>  [1769457709.3023] manager: (tap73e3d3b5-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.304 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.307 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:49 compute-0 nova_compute[183177]: 2026-01-26 20:01:49.308 183181 INFO os_vif [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b')
Jan 26 20:01:50 compute-0 nova_compute[183177]: 2026-01-26 20:01:50.871 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:01:50 compute-0 nova_compute[183177]: 2026-01-26 20:01:50.872 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:01:50 compute-0 nova_compute[183177]: 2026-01-26 20:01:50.872 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No VIF found with MAC fa:16:3e:71:59:1c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:01:50 compute-0 nova_compute[183177]: 2026-01-26 20:01:50.873 183181 INFO nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Using config drive
Jan 26 20:01:51 compute-0 nova_compute[183177]: 2026-01-26 20:01:51.388 183181 WARNING neutronclient.v2_0.client [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:01:51 compute-0 nova_compute[183177]: 2026-01-26 20:01:51.921 183181 INFO nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Creating config drive at /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config
Jan 26 20:01:51 compute-0 nova_compute[183177]: 2026-01-26 20:01:51.928 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4i3s8zev execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.058 183181 DEBUG oslo_concurrency.processutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4i3s8zev" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:01:52 compute-0 kernel: tap73e3d3b5-4b: entered promiscuous mode
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.1213] manager: (tap73e3d3b5-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 26 20:01:52 compute-0 systemd-udevd[213237]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:01:52 compute-0 ovn_controller[95396]: 2026-01-26T20:01:52Z|00181|binding|INFO|Claiming lport 73e3d3b5-4b31-402f-9d36-37f620d86504 for this chassis.
Jan 26 20:01:52 compute-0 ovn_controller[95396]: 2026-01-26T20:01:52Z|00182|binding|INFO|73e3d3b5-4b31-402f-9d36-37f620d86504: Claiming fa:16:3e:71:59:1c 10.100.0.7
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.152 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.161 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:59:1c 10.100.0.7'], port_security=['fa:16:3e:71:59:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c1c1164e-2433-4d44-a423-73e2a317c3c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=73e3d3b5-4b31-402f-9d36-37f620d86504) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.162 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 73e3d3b5-4b31-402f-9d36-37f620d86504 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c bound to our chassis
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.164 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:01:52 compute-0 ovn_controller[95396]: 2026-01-26T20:01:52Z|00183|binding|INFO|Setting lport 73e3d3b5-4b31-402f-9d36-37f620d86504 ovn-installed in OVS
Jan 26 20:01:52 compute-0 ovn_controller[95396]: 2026-01-26T20:01:52Z|00184|binding|INFO|Setting lport 73e3d3b5-4b31-402f-9d36-37f620d86504 up in Southbound
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.168 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.1730] device (tap73e3d3b5-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.1740] device (tap73e3d3b5-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.178 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[de6a058a-ffc2-451f-a4db-43e47bc39d21]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.179 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1220de0-81 in ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.181 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1220de0-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.181 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d73f9bfa-b819-47bf-b44e-68e833020878]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.182 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9d5e9921-62c4-47c7-b750-98ba609e129c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.196 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[97c5ab5b-e7ef-4fe3-abed-90fe7e458b82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 systemd-machined[154465]: New machine qemu-17-instance-00000018.
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.212 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1284ec09-e582-42a2-9976-9d4b9edbd805]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000018.
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.251 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c908a676-9382-4032-b18b-b9ba4e80a66e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.255 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[727dac69-ed47-4b9c-8665-ccd72e8fa64a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.2577] manager: (tapd1220de0-80): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 26 20:01:52 compute-0 systemd-udevd[213240]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.294 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[50395efb-e6a6-4cc7-bffe-6f2eb218e90f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.297 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[edc61c24-0ede-49f0-8560-28cca36810d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.302 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.3341] device (tapd1220de0-80): carrier: link connected
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.342 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c153bd69-b80f-4a87-a780-08c9fbb3cbba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.365 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d39d7ff7-62d7-48d5-930f-c1a7119d6fac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517122, 'reachable_time': 25454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213273, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.367 183181 DEBUG nova.compute.manager [req-97e0758b-0a13-48d2-b48b-f3bae2cd351a req-48e35101-88aa-4be7-89e4-f8abbea403ba 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.367 183181 DEBUG oslo_concurrency.lockutils [req-97e0758b-0a13-48d2-b48b-f3bae2cd351a req-48e35101-88aa-4be7-89e4-f8abbea403ba 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.368 183181 DEBUG oslo_concurrency.lockutils [req-97e0758b-0a13-48d2-b48b-f3bae2cd351a req-48e35101-88aa-4be7-89e4-f8abbea403ba 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.368 183181 DEBUG oslo_concurrency.lockutils [req-97e0758b-0a13-48d2-b48b-f3bae2cd351a req-48e35101-88aa-4be7-89e4-f8abbea403ba 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.368 183181 DEBUG nova.compute.manager [req-97e0758b-0a13-48d2-b48b-f3bae2cd351a req-48e35101-88aa-4be7-89e4-f8abbea403ba 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Processing event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.392 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca510fe-68a3-4605-b5ea-fbbbb72af239]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:e4a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517122, 'tstamp': 517122}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213274, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.412 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b744d149-6ceb-41aa-894c-28aa8b2e4c94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517122, 'reachable_time': 25454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213275, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.457 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d17a81de-6a6f-43b4-a75d-b7fe8f088618]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.553 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[37179ce0-b29d-4248-a3db-190f580485bc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.556 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.557 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.558 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:52 compute-0 kernel: tapd1220de0-80: entered promiscuous mode
Jan 26 20:01:52 compute-0 NetworkManager[55489]: <info>  [1769457712.5612] manager: (tapd1220de0-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.562 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.563 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.565 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 ovn_controller[95396]: 2026-01-26T20:01:52Z|00185|binding|INFO|Releasing lport 7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74 from this chassis (sb_readonly=0)
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.579 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.581 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8947838f-7685-4ec1-8821-79178f22ce00]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.582 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.583 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.583 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d1220de0-89fa-4020-84a8-6d0a816a5b3c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.583 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.583 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6a00e2fd-f0a6-4b8d-9280-315aaa093c4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.584 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.584 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[29d91465-ba59-47f3-8474-91b924c09218]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.585 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:01:52 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:01:52.586 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'env', 'PROCESS_TAG=haproxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1220de0-89fa-4020-84a8-6d0a816a5b3c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.607 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.615 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.619 183181 INFO nova.virt.libvirt.driver [-] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance spawned successfully.
Jan 26 20:01:52 compute-0 nova_compute[183177]: 2026-01-26 20:01:52.620 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:01:53 compute-0 podman[213314]: 2026-01-26 20:01:53.057514978 +0000 UTC m=+0.079085898 container create fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest)
Jan 26 20:01:53 compute-0 podman[213314]: 2026-01-26 20:01:53.012668263 +0000 UTC m=+0.034239243 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:01:53 compute-0 systemd[1]: Started libpod-conmon-fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5.scope.
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.134 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.136 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.136 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.137 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.138 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.139 183181 DEBUG nova.virt.libvirt.driver [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:01:53 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:01:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/050912dfac62a60f414d1233ac7b1167d762b59c831e88d1eb6cb9a39e2f5c7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:01:53 compute-0 podman[213314]: 2026-01-26 20:01:53.187107123 +0000 UTC m=+0.208678103 container init fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Jan 26 20:01:53 compute-0 podman[213314]: 2026-01-26 20:01:53.198748913 +0000 UTC m=+0.220319833 container start fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 20:01:53 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [NOTICE]   (213333) : New worker (213335) forked
Jan 26 20:01:53 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [NOTICE]   (213333) : Loading success.
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.649 183181 INFO nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Took 9.15 seconds to spawn the instance on the hypervisor.
Jan 26 20:01:53 compute-0 nova_compute[183177]: 2026-01-26 20:01:53.650 183181 DEBUG nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.183 183181 INFO nova.compute.manager [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Took 14.39 seconds to build instance.
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.303 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.438 183181 DEBUG nova.compute.manager [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.439 183181 DEBUG oslo_concurrency.lockutils [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.440 183181 DEBUG oslo_concurrency.lockutils [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.440 183181 DEBUG oslo_concurrency.lockutils [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.441 183181 DEBUG nova.compute.manager [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.441 183181 WARNING nova.compute.manager [req-e25644de-99a6-41af-8070-02ff1b0c3095 req-2055bbae-2b56-4221-a36b-c43197bf658d 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received unexpected event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with vm_state active and task_state None.
Jan 26 20:01:54 compute-0 nova_compute[183177]: 2026-01-26 20:01:54.689 183181 DEBUG oslo_concurrency.lockutils [None req-9e18d5a2-2c31-4254-8149-adc8b7113049 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.905s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:01:57 compute-0 nova_compute[183177]: 2026-01-26 20:01:57.344 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:58 compute-0 nova_compute[183177]: 2026-01-26 20:01:58.155 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:01:59 compute-0 nova_compute[183177]: 2026-01-26 20:01:59.307 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:01:59 compute-0 nova_compute[183177]: 2026-01-26 20:01:59.606 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:01:59 compute-0 nova_compute[183177]: 2026-01-26 20:01:59.607 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:01:59 compute-0 podman[192499]: time="2026-01-26T20:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:01:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:01:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 20:02:00 compute-0 nova_compute[183177]: 2026-01-26 20:02:00.113 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:02:00 compute-0 podman[213345]: 2026-01-26 20:02:00.361064381 +0000 UTC m=+0.082552560 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.openshift.expose-services=, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc.)
Jan 26 20:02:00 compute-0 podman[213346]: 2026-01-26 20:02:00.370468732 +0000 UTC m=+0.095545267 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:02:00 compute-0 podman[213344]: 2026-01-26 20:02:00.415273086 +0000 UTC m=+0.149289489 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 20:02:00 compute-0 nova_compute[183177]: 2026-01-26 20:02:00.675 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:00 compute-0 nova_compute[183177]: 2026-01-26 20:02:00.677 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:00 compute-0 nova_compute[183177]: 2026-01-26 20:02:00.689 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:02:00 compute-0 nova_compute[183177]: 2026-01-26 20:02:00.690 183181 INFO nova.compute.claims [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:02:01 compute-0 openstack_network_exporter[195363]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:02:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:02:01 compute-0 openstack_network_exporter[195363]: ERROR   20:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:02:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:02:01 compute-0 nova_compute[183177]: 2026-01-26 20:02:01.771 183181 DEBUG nova.compute.provider_tree [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:02:01 compute-0 anacron[30974]: Job `cron.weekly' started
Jan 26 20:02:01 compute-0 anacron[30974]: Job `cron.weekly' terminated
Jan 26 20:02:02 compute-0 nova_compute[183177]: 2026-01-26 20:02:02.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:02 compute-0 nova_compute[183177]: 2026-01-26 20:02:02.286 183181 DEBUG nova.scheduler.client.report [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:02:02 compute-0 nova_compute[183177]: 2026-01-26 20:02:02.347 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:02 compute-0 nova_compute[183177]: 2026-01-26 20:02:02.798 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:02 compute-0 nova_compute[183177]: 2026-01-26 20:02:02.799 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.311 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.311 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.312 183181 WARNING neutronclient.v2_0.client [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.312 183181 WARNING neutronclient.v2_0.client [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:03.793 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:02:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:03.819 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.820 183181 INFO nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.826 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:03 compute-0 nova_compute[183177]: 2026-01-26 20:02:03.948 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Successfully created port: f5880f43-5845-4174-b639-e51cab7c6f2f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:02:04 compute-0 nova_compute[183177]: 2026-01-26 20:02:04.310 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:04 compute-0 nova_compute[183177]: 2026-01-26 20:02:04.336 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:02:05 compute-0 ovn_controller[95396]: 2026-01-26T20:02:05Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:59:1c 10.100.0.7
Jan 26 20:02:05 compute-0 ovn_controller[95396]: 2026-01-26T20:02:05Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:59:1c 10.100.0.7
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.359 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.361 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.361 183181 INFO nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Creating image(s)
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.362 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.363 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.364 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.365 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.372 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.375 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.466 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.468 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.469 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.470 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.476 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.477 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.564 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.566 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.607 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.608 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.609 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.660 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.662 183181 DEBUG nova.virt.disk.api [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Checking if we can resize image /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.662 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.732 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.733 183181 DEBUG nova.virt.disk.api [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Cannot resize image /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.734 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.734 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Ensure instance console log exists: /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.735 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.735 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:05 compute-0 nova_compute[183177]: 2026-01-26 20:02:05.735 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:05.820 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.210 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Successfully updated port: f5880f43-5845-4174-b639-e51cab7c6f2f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.269 183181 DEBUG nova.compute.manager [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-changed-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.269 183181 DEBUG nova.compute.manager [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Refreshing instance network info cache due to event network-changed-f5880f43-5845-4174-b639-e51cab7c6f2f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.270 183181 DEBUG oslo_concurrency.lockutils [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.270 183181 DEBUG oslo_concurrency.lockutils [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.270 183181 DEBUG nova.network.neutron [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Refreshing network info cache for port f5880f43-5845-4174-b639-e51cab7c6f2f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.688 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.689 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.689 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.689 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.715 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:02:06 compute-0 nova_compute[183177]: 2026-01-26 20:02:06.777 183181 WARNING neutronclient.v2_0.client [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.025 183181 DEBUG nova.network.neutron [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.200 183181 DEBUG nova.network.neutron [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.386 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.710 183181 DEBUG oslo_concurrency.lockutils [req-60b93b38-15c5-4e4c-a514-1fa5412f9bd4 req-9f59f6fe-7261-4f5d-8ee0-f239059f4918 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.711 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquired lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.711 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.742 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.832 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.833 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:07 compute-0 nova_compute[183177]: 2026-01-26 20:02:07.904 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.141 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.144 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.182 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.183 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5453MB free_disk=73.06919479370117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.184 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.184 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:08 compute-0 podman[213446]: 2026-01-26 20:02:08.316824548 +0000 UTC m=+0.063044491 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.505 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.724 183181 WARNING neutronclient.v2_0.client [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:08 compute-0 nova_compute[183177]: 2026-01-26 20:02:08.939 183181 DEBUG nova.network.neutron [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Updating instance_info_cache with network_info: [{"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.249 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance c1c1164e-2433-4d44-a423-73e2a317c3c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.250 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance e905a08b-37ed-4341-8762-bf8b43688cc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.250 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.251 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=2814MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:02:08 up  1:26,  0 user,  load average: 0.59, 0.31, 0.29\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_3ab7d887b45a437cabdface06e8a9be1': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_spawning': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.276 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.300 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.301 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.315 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.326 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.362 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.438 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.447 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Releasing lock "refresh_cache-e905a08b-37ed-4341-8762-bf8b43688cc5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.448 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance network_info: |[{"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.452 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Start _get_guest_xml network_info=[{"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.458 183181 WARNING nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.460 183181 DEBUG nova.virt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-2125674647', uuid='e905a08b-37ed-4341-8762-bf8b43688cc5'), owner=OwnerMeta(userid='7033feaa27a8427197df3725be3d1a7a', username='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin', projectid='3ab7d887b45a437cabdface06e8a9be1', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1111771467'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-1757377249', flavorid='63d24b8a-64ab-43ac-be82-04ef5f355697', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": 
"f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457729.4605558) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.465 183181 DEBUG nova.virt.libvirt.host [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.466 183181 DEBUG nova.virt.libvirt.host [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.469 183181 DEBUG nova.virt.libvirt.host [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.470 183181 DEBUG nova.virt.libvirt.host [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.472 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.472 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T20:01:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='63d24b8a-64ab-43ac-be82-04ef5f355697',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-1757377249',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.473 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.474 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.474 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.475 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.475 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.475 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.476 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.476 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.477 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.477 183181 DEBUG nova.virt.hardware [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.484 183181 DEBUG nova.virt.libvirt.vif [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2125674647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2125674647',id=25,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-vcs629i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-
TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:02:04Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=e905a08b-37ed-4341-8762-bf8b43688cc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.484 183181 DEBUG nova.network.os_vif_util [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.486 183181 DEBUG nova.network.os_vif_util [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.487 183181 DEBUG nova.objects.instance [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'pci_devices' on Instance uuid e905a08b-37ed-4341-8762-bf8b43688cc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.947 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:02:09 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.997 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <uuid>e905a08b-37ed-4341-8762-bf8b43688cc5</uuid>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <name>instance-00000019</name>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <memory>1178624</memory>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-2125674647</nova:name>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:02:09</nova:creationTime>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:flavor name="tempest-watcher_flavor-1757377249" id="63d24b8a-64ab-43ac-be82-04ef5f355697">
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:memory>1151</nova:memory>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:extraSpecs/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:02:09 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         <nova:port uuid="f5880f43-5845-4174-b639-e51cab7c6f2f">
Jan 26 20:02:09 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <system>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="serial">e905a08b-37ed-4341-8762-bf8b43688cc5</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="uuid">e905a08b-37ed-4341-8762-bf8b43688cc5</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </system>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <os>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </os>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <features>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </features>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.config"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:7e:ef:83"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <target dev="tapf5880f43-58"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/console.log" append="off"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <video>
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </video>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:02:09 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:02:09 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:02:09 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:02:10 compute-0 nova_compute[183177]: </domain>
Jan 26 20:02:10 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:09.999 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Preparing to wait for external event network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.000 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.000 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.001 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.002 183181 DEBUG nova.virt.libvirt.vif [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2125674647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2125674647',id=25,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-vcs629i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name
='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:02:04Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=e905a08b-37ed-4341-8762-bf8b43688cc5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.002 183181 DEBUG nova.network.os_vif_util [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.003 183181 DEBUG nova.network.os_vif_util [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.004 183181 DEBUG os_vif [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.005 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.005 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.006 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.007 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.008 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2e34eb63-62f6-5ab9-8a72-4b52b0f442a6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.009 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.011 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.016 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.016 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5880f43-58, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.017 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf5880f43-58, col_values=(('qos', UUID('b0f7ffdf-7745-4654-8a14-613696967cd1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.017 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf5880f43-58, col_values=(('external_ids', {'iface-id': 'f5880f43-5845-4174-b639-e51cab7c6f2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:ef:83', 'vm-uuid': 'e905a08b-37ed-4341-8762-bf8b43688cc5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.018 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 NetworkManager[55489]: <info>  [1769457730.0214] manager: (tapf5880f43-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.021 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.026 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.027 183181 INFO os_vif [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58')
Jan 26 20:02:10 compute-0 sshd-session[213470]: Connection closed by authenticating user root 142.93.140.142 port 43644 [preauth]
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.595 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:02:10 compute-0 nova_compute[183177]: 2026-01-26 20:02:10.595 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.411s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.591 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.592 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.592 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.592 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.631 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.632 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.632 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] No VIF found with MAC fa:16:3e:7e:ef:83, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:02:11 compute-0 nova_compute[183177]: 2026-01-26 20:02:11.632 183181 INFO nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Using config drive
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.143 183181 WARNING neutronclient.v2_0.client [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.375 183181 INFO nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Creating config drive at /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.config
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.379 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmps0zq1ax3 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.388 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.511 183181 DEBUG oslo_concurrency.processutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmps0zq1ax3" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:12 compute-0 kernel: tapf5880f43-58: entered promiscuous mode
Jan 26 20:02:12 compute-0 NetworkManager[55489]: <info>  [1769457732.6107] manager: (tapf5880f43-58): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.611 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 ovn_controller[95396]: 2026-01-26T20:02:12Z|00186|binding|INFO|Claiming lport f5880f43-5845-4174-b639-e51cab7c6f2f for this chassis.
Jan 26 20:02:12 compute-0 ovn_controller[95396]: 2026-01-26T20:02:12Z|00187|binding|INFO|f5880f43-5845-4174-b639-e51cab7c6f2f: Claiming fa:16:3e:7e:ef:83 10.100.0.3
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.620 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:ef:83 10.100.0.3'], port_security=['fa:16:3e:7e:ef:83 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e905a08b-37ed-4341-8762-bf8b43688cc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=f5880f43-5845-4174-b639-e51cab7c6f2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.621 104672 INFO neutron.agent.ovn.metadata.agent [-] Port f5880f43-5845-4174-b639-e51cab7c6f2f in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c bound to our chassis
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.622 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:02:12 compute-0 ovn_controller[95396]: 2026-01-26T20:02:12Z|00188|binding|INFO|Setting lport f5880f43-5845-4174-b639-e51cab7c6f2f ovn-installed in OVS
Jan 26 20:02:12 compute-0 ovn_controller[95396]: 2026-01-26T20:02:12Z|00189|binding|INFO|Setting lport f5880f43-5845-4174-b639-e51cab7c6f2f up in Southbound
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.633 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.637 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 systemd-udevd[213490]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.644 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[50caf829-9686-40c9-900b-ae38ca05f17b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 systemd-machined[154465]: New machine qemu-18-instance-00000019.
Jan 26 20:02:12 compute-0 NetworkManager[55489]: <info>  [1769457732.6621] device (tapf5880f43-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:02:12 compute-0 NetworkManager[55489]: <info>  [1769457732.6626] device (tapf5880f43-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:02:12 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000019.
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.674 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[82eb21fd-af4a-41bc-9b25-161d0bcd4bab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.676 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[cacd7d01-cc29-44c2-bf00-65dc3a654184]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.708 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[97b0b7ec-fe3f-4a13-a129-4d7e9c652f62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.729 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[deb971bd-c256-470f-be8c-d6b1c8ab5c05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517122, 'reachable_time': 25454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213503, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.749 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2aded524-2ba1-4e20-a5de-47bfae00e391]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517138, 'tstamp': 517138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213506, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517143, 'tstamp': 517143}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213506, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.750 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.752 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 nova_compute[183177]: 2026-01-26 20:02:12.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.753 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.753 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.753 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.754 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:02:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:12.755 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[83faafcf-b3b6-4fbe-bb4c-5db7102b092a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.367 183181 DEBUG nova.compute.manager [req-7e636587-956f-41ba-8c23-7496eed436c8 req-7044f134-effa-4bd2-bca8-5226cde0ba2b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.368 183181 DEBUG oslo_concurrency.lockutils [req-7e636587-956f-41ba-8c23-7496eed436c8 req-7044f134-effa-4bd2-bca8-5226cde0ba2b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.368 183181 DEBUG oslo_concurrency.lockutils [req-7e636587-956f-41ba-8c23-7496eed436c8 req-7044f134-effa-4bd2-bca8-5226cde0ba2b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.369 183181 DEBUG oslo_concurrency.lockutils [req-7e636587-956f-41ba-8c23-7496eed436c8 req-7044f134-effa-4bd2-bca8-5226cde0ba2b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.369 183181 DEBUG nova.compute.manager [req-7e636587-956f-41ba-8c23-7496eed436c8 req-7044f134-effa-4bd2-bca8-5226cde0ba2b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Processing event network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.370 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.376 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.381 183181 INFO nova.virt.libvirt.driver [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance spawned successfully.
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.382 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.897 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.897 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.898 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.898 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.899 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:13 compute-0 nova_compute[183177]: 2026-01-26 20:02:13.899 183181 DEBUG nova.virt.libvirt.driver [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:02:14 compute-0 nova_compute[183177]: 2026-01-26 20:02:14.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:14 compute-0 nova_compute[183177]: 2026-01-26 20:02:14.409 183181 INFO nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Took 9.05 seconds to spawn the instance on the hypervisor.
Jan 26 20:02:14 compute-0 nova_compute[183177]: 2026-01-26 20:02:14.410 183181 DEBUG nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:02:14 compute-0 nova_compute[183177]: 2026-01-26 20:02:14.948 183181 INFO nova.compute.manager [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Took 14.33 seconds to build instance.
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.050 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.455 183181 DEBUG oslo_concurrency.lockutils [None req-7c86101f-66a2-48f4-b85b-b2fce8b8336b 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.848s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.528 183181 DEBUG nova.compute.manager [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.528 183181 DEBUG oslo_concurrency.lockutils [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.528 183181 DEBUG oslo_concurrency.lockutils [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.528 183181 DEBUG oslo_concurrency.lockutils [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.529 183181 DEBUG nova.compute.manager [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] No waiting events found dispatching network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:15 compute-0 nova_compute[183177]: 2026-01-26 20:02:15.529 183181 WARNING nova.compute.manager [req-eeac4087-6e87-43e5-b6e6-4a618ccbba63 req-30992028-dbde-4ff0-8d58-210b72f0f7b5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received unexpected event network-vif-plugged-f5880f43-5845-4174-b639-e51cab7c6f2f for instance with vm_state active and task_state None.
Jan 26 20:02:17 compute-0 nova_compute[183177]: 2026-01-26 20:02:17.419 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:20 compute-0 nova_compute[183177]: 2026-01-26 20:02:20.052 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:22 compute-0 nova_compute[183177]: 2026-01-26 20:02:22.422 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:24.085 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:24.086 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:24.087 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:24 compute-0 nova_compute[183177]: 2026-01-26 20:02:24.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:25 compute-0 nova_compute[183177]: 2026-01-26 20:02:25.056 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:25 compute-0 sshd-session[213519]: Connection closed by authenticating user root 188.166.116.149 port 58400 [preauth]
Jan 26 20:02:25 compute-0 ovn_controller[95396]: 2026-01-26T20:02:25Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7e:ef:83 10.100.0.3
Jan 26 20:02:25 compute-0 ovn_controller[95396]: 2026-01-26T20:02:25Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7e:ef:83 10.100.0.3
Jan 26 20:02:27 compute-0 nova_compute[183177]: 2026-01-26 20:02:27.471 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:29 compute-0 podman[192499]: time="2026-01-26T20:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:02:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:02:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Jan 26 20:02:30 compute-0 nova_compute[183177]: 2026-01-26 20:02:30.056 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:31 compute-0 openstack_network_exporter[195363]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:02:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:02:31 compute-0 openstack_network_exporter[195363]: ERROR   20:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:02:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:02:31 compute-0 podman[213533]: 2026-01-26 20:02:31.634628613 +0000 UTC m=+0.067858139 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter)
Jan 26 20:02:31 compute-0 podman[213534]: 2026-01-26 20:02:31.653009684 +0000 UTC m=+0.084263648 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 20:02:31 compute-0 podman[213532]: 2026-01-26 20:02:31.659964469 +0000 UTC m=+0.092406454 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:02:32 compute-0 nova_compute[183177]: 2026-01-26 20:02:32.508 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Check if temp file /var/lib/nova/instances/tmpxsvnanhm exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:02:32 compute-0 nova_compute[183177]: 2026-01-26 20:02:32.514 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxsvnanhm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1c1164e-2433-4d44-a423-73e2a317c3c1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:02:32 compute-0 nova_compute[183177]: 2026-01-26 20:02:32.516 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:35 compute-0 nova_compute[183177]: 2026-01-26 20:02:35.059 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.498 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.567 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.568 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.576 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.620 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.621 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Preparing to wait for external event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.622 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.622 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:37 compute-0 nova_compute[183177]: 2026-01-26 20:02:37.622 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:39 compute-0 podman[213603]: 2026-01-26 20:02:39.316197322 +0000 UTC m=+0.061795138 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:02:40 compute-0 nova_compute[183177]: 2026-01-26 20:02:40.063 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:42 compute-0 nova_compute[183177]: 2026-01-26 20:02:42.624 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:42 compute-0 ovn_controller[95396]: 2026-01-26T20:02:42Z|00190|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.115 183181 DEBUG nova.compute.manager [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.116 183181 DEBUG oslo_concurrency.lockutils [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.116 183181 DEBUG oslo_concurrency.lockutils [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.117 183181 DEBUG oslo_concurrency.lockutils [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.117 183181 DEBUG nova.compute.manager [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No event matching network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 in dict_keys([('network-vif-plugged', '73e3d3b5-4b31-402f-9d36-37f620d86504')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:02:44 compute-0 nova_compute[183177]: 2026-01-26 20:02:44.118 183181 DEBUG nova.compute.manager [req-469dbb3e-9dcc-4279-bc1e-b8e5a06ecf10 req-5e81851c-5eda-4485-b5a1-3fae61169eef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:02:45 compute-0 nova_compute[183177]: 2026-01-26 20:02:45.073 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.144 183181 INFO nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Took 8.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.180 183181 DEBUG nova.compute.manager [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.181 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.181 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.181 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.181 183181 DEBUG nova.compute.manager [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Processing event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.182 183181 DEBUG nova.compute.manager [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-changed-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.182 183181 DEBUG nova.compute.manager [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Refreshing instance network info cache due to event network-changed-73e3d3b5-4b31-402f-9d36-37f620d86504. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.182 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.182 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.182 183181 DEBUG nova.network.neutron [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Refreshing network info cache for port 73e3d3b5-4b31-402f-9d36-37f620d86504 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.183 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.689 183181 WARNING neutronclient.v2_0.client [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:46 compute-0 nova_compute[183177]: 2026-01-26 20:02:46.694 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxsvnanhm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1c1164e-2433-4d44-a423-73e2a317c3c1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(b8631618-5997-4897-98a0-76cada1d9d5d),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:02:46 compute-0 sshd-session[213628]: Connection closed by authenticating user root 142.93.140.142 port 49086 [preauth]
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.209 183181 DEBUG nova.objects.instance [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid c1c1164e-2433-4d44-a423-73e2a317c3c1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.211 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.214 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.214 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.428 183181 WARNING neutronclient.v2_0.client [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.625 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.657 183181 DEBUG nova.network.neutron [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Updated VIF entry in instance network info cache for port 73e3d3b5-4b31-402f-9d36-37f620d86504. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.658 183181 DEBUG nova.network.neutron [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Updating instance_info_cache with network_info: [{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.718 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.718 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.724 183181 DEBUG nova.virt.libvirt.vif [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:01:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1320665426',id=24,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:01:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-9i1e2zf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:01:53Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=c1c1164e-2433-4d44-a423-73e2a317c3c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.724 183181 DEBUG nova.network.os_vif_util [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.725 183181 DEBUG nova.network.os_vif_util [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.726 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:71:59:1c"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <target dev="tap73e3d3b5-4b"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]: </interface>
Jan 26 20:02:47 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.726 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <name>instance-00000018</name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <uuid>c1c1164e-2433-4d44-a423-73e2a317c3c1</uuid>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426</nova:name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:01:48</nova:creationTime>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:flavor name="tempest-watcher_flavor-1757377249" id="63d24b8a-64ab-43ac-be82-04ef5f355697">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:memory>1151</nova:memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:extraSpecs/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:port uuid="73e3d3b5-4b31-402f-9d36-37f620d86504">
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <memory unit="KiB">1178624</memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">1178624</currentMemory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="serial">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="uuid">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:71:59:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap73e3d3b5-4b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </target>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </console>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </input>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]: </domain>
Jan 26 20:02:47 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.729 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <name>instance-00000018</name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <uuid>c1c1164e-2433-4d44-a423-73e2a317c3c1</uuid>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426</nova:name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:01:48</nova:creationTime>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:flavor name="tempest-watcher_flavor-1757377249" id="63d24b8a-64ab-43ac-be82-04ef5f355697">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:memory>1151</nova:memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:extraSpecs/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:port uuid="73e3d3b5-4b31-402f-9d36-37f620d86504">
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <memory unit="KiB">1178624</memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">1178624</currentMemory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="serial">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="uuid">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:71:59:1c"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="tap73e3d3b5-4b"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </target>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </console>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </input>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]: </domain>
Jan 26 20:02:47 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.730 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <name>instance-00000018</name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <uuid>c1c1164e-2433-4d44-a423-73e2a317c3c1</uuid>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426</nova:name>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:01:48</nova:creationTime>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:flavor name="tempest-watcher_flavor-1757377249" id="63d24b8a-64ab-43ac-be82-04ef5f355697">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:memory>1151</nova:memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:extraSpecs/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:user uuid="7033feaa27a8427197df3725be3d1a7a">tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin</nova:user>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:project uuid="3ab7d887b45a437cabdface06e8a9be1">tempest-TestExecuteWorkloadBalanceStrategy-1111771467</nova:project>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <nova:port uuid="73e3d3b5-4b31-402f-9d36-37f620d86504">
Jan 26 20:02:47 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <memory unit="KiB">1178624</memory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">1178624</currentMemory>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="serial">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="uuid">c1c1164e-2433-4d44-a423-73e2a317c3c1</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </system>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </os>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </features>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/disk.config"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:71:59:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap73e3d3b5-4b"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:02:47 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       </target>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1/console.log" append="off"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </console>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </input>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </video>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:02:47 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:02:47 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:02:47 compute-0 nova_compute[183177]: </domain>
Jan 26 20:02:47 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:02:47 compute-0 nova_compute[183177]: 2026-01-26 20:02:47.731 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:02:48 compute-0 nova_compute[183177]: 2026-01-26 20:02:48.168 183181 DEBUG oslo_concurrency.lockutils [req-44700c04-0a81-4f26-a3ad-9d3de5ef082b req-115a9a0c-7d9a-45dd-ac3c-1733e3e1df4f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-c1c1164e-2433-4d44-a423-73e2a317c3c1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:02:48 compute-0 nova_compute[183177]: 2026-01-26 20:02:48.221 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:02:48 compute-0 nova_compute[183177]: 2026-01-26 20:02:48.221 183181 INFO nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:02:49 compute-0 nova_compute[183177]: 2026-01-26 20:02:49.243 183181 INFO nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:02:49 compute-0 nova_compute[183177]: 2026-01-26 20:02:49.747 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:02:49 compute-0 nova_compute[183177]: 2026-01-26 20:02:49.748 183181 DEBUG nova.virt.libvirt.migration [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:02:50 compute-0 kernel: tap73e3d3b5-4b (unregistering): left promiscuous mode
Jan 26 20:02:50 compute-0 NetworkManager[55489]: <info>  [1769457770.0556] device (tap73e3d3b5-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:02:50 compute-0 ovn_controller[95396]: 2026-01-26T20:02:50Z|00191|binding|INFO|Releasing lport 73e3d3b5-4b31-402f-9d36-37f620d86504 from this chassis (sb_readonly=0)
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.066 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 ovn_controller[95396]: 2026-01-26T20:02:50Z|00192|binding|INFO|Setting lport 73e3d3b5-4b31-402f-9d36-37f620d86504 down in Southbound
Jan 26 20:02:50 compute-0 ovn_controller[95396]: 2026-01-26T20:02:50Z|00193|binding|INFO|Removing iface tap73e3d3b5-4b ovn-installed in OVS
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.069 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.074 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:59:1c 10.100.0.7'], port_security=['fa:16:3e:71:59:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c1c1164e-2433-4d44-a423-73e2a317c3c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=73e3d3b5-4b31-402f-9d36-37f620d86504) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.075 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 73e3d3b5-4b31-402f-9d36-37f620d86504 in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c unbound from our chassis
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.076 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1220de0-89fa-4020-84a8-6d0a816a5b3c
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.077 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.080 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.103 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[52fb2edc-b795-4d0d-baa8-64c14e97f59f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 26 20:02:50 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000018.scope: Consumed 15.640s CPU time.
Jan 26 20:02:50 compute-0 systemd-machined[154465]: Machine qemu-17-instance-00000018 terminated.
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.144 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[d0905cf5-74be-42be-a04e-c2cb1014644b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.149 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[e77b2faa-026e-43d0-8772-83d10d048ad9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.184 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a1be7a96-d999-4810-a5e4-49abe65debff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.199 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[76bac26f-2cc8-41f9-964b-b39435c2ab09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1220de0-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:e4:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517122, 'reachable_time': 25454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213656, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.221 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[20eddba7-57af-439a-8474-d52e4773a958]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517138, 'tstamp': 517138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213657, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1220de0-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517143, 'tstamp': 517143}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213657, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.222 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.224 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.229 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.229 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1220de0-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.229 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.230 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1220de0-80, col_values=(('external_ids', {'iface-id': '7a5d2ac3-15cd-4ece-b5ef-f600fe36ec74'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.230 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:02:50 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:02:50.231 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a10f01f3-c996-4f1a-a1f9-d92cf87950cc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1220de0-89fa-4020-84a8-6d0a816a5b3c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1220de0-89fa-4020-84a8-6d0a816a5b3c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.301 183181 DEBUG nova.virt.libvirt.guest [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.302 183181 INFO nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration operation has completed
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.302 183181 INFO nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] _post_live_migration() is started..
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.305 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.305 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.305 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.314 183181 WARNING neutronclient.v2_0.client [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.314 183181 WARNING neutronclient.v2_0.client [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.359 183181 DEBUG nova.compute.manager [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.359 183181 DEBUG oslo_concurrency.lockutils [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.360 183181 DEBUG oslo_concurrency.lockutils [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.360 183181 DEBUG oslo_concurrency.lockutils [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.360 183181 DEBUG nova.compute.manager [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.361 183181 DEBUG nova.compute.manager [req-8135a0a2-492d-4e51-855e-e7634a760369 req-e06e972a-ab08-4506-9850-e07f4224be3e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.764 183181 DEBUG nova.compute.manager [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.765 183181 DEBUG oslo_concurrency.lockutils [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.765 183181 DEBUG oslo_concurrency.lockutils [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.765 183181 DEBUG oslo_concurrency.lockutils [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.766 183181 DEBUG nova.compute.manager [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.766 183181 DEBUG nova.compute.manager [req-c6f3516e-6ca8-4745-acd7-a111040f1bf1 req-c4979065-3d9a-4b2f-9fd4-9c25d5b6897a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.915 183181 DEBUG nova.network.neutron [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 73e3d3b5-4b31-402f-9d36-37f620d86504 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.915 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.916 183181 DEBUG nova.virt.libvirt.vif [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:01:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1320665426',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1320665426',id=24,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:01:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-9i1e2zf5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:02:27Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=c1c1164e-2433-4d44-a423-73e2a317c3c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.916 183181 DEBUG nova.network.os_vif_util [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "73e3d3b5-4b31-402f-9d36-37f620d86504", "address": "fa:16:3e:71:59:1c", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73e3d3b5-4b", "ovs_interfaceid": "73e3d3b5-4b31-402f-9d36-37f620d86504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.917 183181 DEBUG nova.network.os_vif_util [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.917 183181 DEBUG os_vif [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.919 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.919 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73e3d3b5-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.957 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.961 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.961 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.961 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e0ad11fe-d2c4-4cc2-8f7e-8026a0f093e7) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.962 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.963 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.965 183181 INFO os_vif [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:59:1c,bridge_name='br-int',has_traffic_filtering=True,id=73e3d3b5-4b31-402f-9d36-37f620d86504,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73e3d3b5-4b')
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.965 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.966 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.966 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.967 183181 DEBUG nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.967 183181 INFO nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Deleting instance files /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1_del
Jan 26 20:02:50 compute-0 nova_compute[183177]: 2026-01-26 20:02:50.968 183181 INFO nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Deletion of /var/lib/nova/instances/c1c1164e-2433-4d44-a423-73e2a317c3c1_del complete
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.455 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.456 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.456 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.457 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.457 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.457 183181 WARNING nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received unexpected event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with vm_state active and task_state migrating.
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.458 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.458 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.459 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.459 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.460 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.460 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-unplugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.460 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.461 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.461 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.462 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.462 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.462 183181 WARNING nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received unexpected event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with vm_state active and task_state migrating.
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.463 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.463 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.464 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.464 183181 DEBUG oslo_concurrency.lockutils [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.464 183181 DEBUG nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] No waiting events found dispatching network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.465 183181 WARNING nova.compute.manager [req-474f85aa-6dd6-4e4e-9cab-d61e565f0b6a req-7db17ba1-07db-43d1-9655-f1b466eae33b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Received unexpected event network-vif-plugged-73e3d3b5-4b31-402f-9d36-37f620d86504 for instance with vm_state active and task_state migrating.
Jan 26 20:02:52 compute-0 nova_compute[183177]: 2026-01-26 20:02:52.627 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:55 compute-0 nova_compute[183177]: 2026-01-26 20:02:55.993 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:57 compute-0 nova_compute[183177]: 2026-01-26 20:02:57.630 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:02:58 compute-0 nova_compute[183177]: 2026-01-26 20:02:58.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:02:59 compute-0 podman[192499]: time="2026-01-26T20:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:02:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:02:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.007 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.008 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.009 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "c1c1164e-2433-4d44-a423-73e2a317c3c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.522 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.523 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.523 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:00 compute-0 nova_compute[183177]: 2026-01-26 20:03:00.523 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.005 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:01 compute-0 openstack_network_exporter[195363]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:03:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:03:01 compute-0 openstack_network_exporter[195363]: ERROR   20:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:03:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.579 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.659 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.660 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.749 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.955 183181 WARNING nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.957 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.998 183181 DEBUG oslo_concurrency.processutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.998 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5506MB free_disk=73.06892776489258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.999 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:01 compute-0 nova_compute[183177]: 2026-01-26 20:03:01.999 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:02 compute-0 podman[213686]: 2026-01-26 20:03:02.362416942 +0000 UTC m=+0.090766451 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 20:03:02 compute-0 podman[213685]: 2026-01-26 20:03:02.372612973 +0000 UTC m=+0.111164484 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Jan 26 20:03:02 compute-0 podman[213684]: 2026-01-26 20:03:02.402741185 +0000 UTC m=+0.140781472 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 20:03:02 compute-0 nova_compute[183177]: 2026-01-26 20:03:02.633 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.026 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance c1c1164e-2433-4d44-a423-73e2a317c3c1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.540 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.619 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Instance e905a08b-37ed-4341-8762-bf8b43688cc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.619 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration b8631618-5997-4897-98a0-76cada1d9d5d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 1151, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.620 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.620 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:03:01 up  1:27,  0 user,  load average: 0.39, 0.30, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3ab7d887b45a437cabdface06e8a9be1': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:03:03 compute-0 sshd-session[213747]: Connection closed by authenticating user root 188.166.116.149 port 34320 [preauth]
Jan 26 20:03:03 compute-0 nova_compute[183177]: 2026-01-26 20:03:03.746 183181 DEBUG nova.compute.provider_tree [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.125 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.126 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.127 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.127 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.127 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.149 183181 INFO nova.compute.manager [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Terminating instance
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.253 183181 DEBUG nova.scheduler.client.report [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.666 183181 DEBUG nova.compute.manager [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 20:03:04 compute-0 kernel: tapf5880f43-58 (unregistering): left promiscuous mode
Jan 26 20:03:04 compute-0 NetworkManager[55489]: <info>  [1769457784.6944] device (tapf5880f43-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.702 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:04 compute-0 ovn_controller[95396]: 2026-01-26T20:03:04Z|00194|binding|INFO|Releasing lport f5880f43-5845-4174-b639-e51cab7c6f2f from this chassis (sb_readonly=0)
Jan 26 20:03:04 compute-0 ovn_controller[95396]: 2026-01-26T20:03:04Z|00195|binding|INFO|Setting lport f5880f43-5845-4174-b639-e51cab7c6f2f down in Southbound
Jan 26 20:03:04 compute-0 ovn_controller[95396]: 2026-01-26T20:03:04Z|00196|binding|INFO|Removing iface tapf5880f43-58 ovn-installed in OVS
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:04.711 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:ef:83 10.100.0.3'], port_security=['fa:16:3e:7e:ef:83 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e905a08b-37ed-4341-8762-bf8b43688cc5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ab7d887b45a437cabdface06e8a9be1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0cb4f9c1-bc93-4265-b41c-06f910025486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee375b04-bdd2-4f20-981a-bcd11f6bf341, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=f5880f43-5845-4174-b639-e51cab7c6f2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:03:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:04.712 104672 INFO neutron.agent.ovn.metadata.agent [-] Port f5880f43-5845-4174-b639-e51cab7c6f2f in datapath d1220de0-89fa-4020-84a8-6d0a816a5b3c unbound from our chassis
Jan 26 20:03:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:04.712 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1220de0-89fa-4020-84a8-6d0a816a5b3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:03:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:04.713 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3db0b769-852f-4d9f-a785-0be84f98771f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:04 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:04.714 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c namespace which is not needed anymore
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.734 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:04 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 26 20:03:04 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000019.scope: Consumed 14.869s CPU time.
Jan 26 20:03:04 compute-0 systemd-machined[154465]: Machine qemu-18-instance-00000019 terminated.
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.764 183181 DEBUG nova.compute.resource_tracker [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.766 183181 DEBUG oslo_concurrency.lockutils [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.767s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.788 183181 INFO nova.compute.manager [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:03:04 compute-0 podman[213774]: 2026-01-26 20:03:04.876116804 +0000 UTC m=+0.042395472 container kill fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:03:04 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [NOTICE]   (213333) : haproxy version is 3.0.5-8e879a5
Jan 26 20:03:04 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [NOTICE]   (213333) : path to executable is /usr/sbin/haproxy
Jan 26 20:03:04 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [WARNING]  (213333) : Exiting Master process...
Jan 26 20:03:04 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [ALERT]    (213333) : Current worker (213335) exited with code 143 (Terminated)
Jan 26 20:03:04 compute-0 neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c[213329]: [WARNING]  (213333) : All workers exited. Exiting... (0)
Jan 26 20:03:04 compute-0 systemd[1]: libpod-fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5.scope: Deactivated successfully.
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.938 183181 INFO nova.virt.libvirt.driver [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Instance destroyed successfully.
Jan 26 20:03:04 compute-0 nova_compute[183177]: 2026-01-26 20:03:04.939 183181 DEBUG nova.objects.instance [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lazy-loading 'resources' on Instance uuid e905a08b-37ed-4341-8762-bf8b43688cc5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:03:04 compute-0 podman[213792]: 2026-01-26 20:03:04.946748995 +0000 UTC m=+0.042451214 container died fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:03:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5-userdata-shm.mount: Deactivated successfully.
Jan 26 20:03:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-050912dfac62a60f414d1233ac7b1167d762b59c831e88d1eb6cb9a39e2f5c7a-merged.mount: Deactivated successfully.
Jan 26 20:03:04 compute-0 podman[213792]: 2026-01-26 20:03:04.992433465 +0000 UTC m=+0.088135654 container cleanup fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:03:05 compute-0 systemd[1]: libpod-conmon-fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5.scope: Deactivated successfully.
Jan 26 20:03:05 compute-0 podman[213799]: 2026-01-26 20:03:05.011829357 +0000 UTC m=+0.097242449 container remove fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120)
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.020 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d06881d2-c757-4c49-bb4a-09292dde89fb]: (4, ("Mon Jan 26 08:03:04 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c (fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5)\nfbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5\nMon Jan 26 08:03:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c (fbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5)\nfbbbcbadd050a6c63849e65d032b549a40e25df182b98aa902290be0a559b9e5\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.022 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[03d5136d-098d-41cb-81bc-eecb5751480c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.022 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1220de0-89fa-4020-84a8-6d0a816a5b3c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.023 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab32514-d7ff-4584-a9e9-e84669afbd59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.023 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1220de0-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.025 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 kernel: tapd1220de0-80: left promiscuous mode
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.039 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.043 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff650d0-c249-4d0f-96bf-a467d8b6fa40]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.045 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.061 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[504c9cf8-05d8-47c5-9d2c-1b6f252a0c0d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.062 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d6674522-c8c7-48d8-8d95-8600f769e5d1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.078 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ce5a7ec1-7cf9-47c1-98a3-742ad457c38f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517113, 'reachable_time': 42232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213841, 'error': None, 'target': 'ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.080 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1220de0-89fa-4020-84a8-6d0a816a5b3c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:03:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:05.081 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[918bf375-b5c0-4847-ba6d-0735a452e59c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:05 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1220de0\x2d89fa\x2d4020\x2d84a8\x2d6d0a816a5b3c.mount: Deactivated successfully.
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.369 183181 DEBUG nova.compute.manager [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.370 183181 DEBUG oslo_concurrency.lockutils [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.370 183181 DEBUG oslo_concurrency.lockutils [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.371 183181 DEBUG oslo_concurrency.lockutils [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.371 183181 DEBUG nova.compute.manager [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] No waiting events found dispatching network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.372 183181 DEBUG nova.compute.manager [req-4a5b2f22-743f-4858-a183-885b15f12b58 req-6d9b4131-84e8-4416-9954-dac3dd70323f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.445 183181 DEBUG nova.virt.libvirt.vif [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2125674647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2125674647',id=25,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:02:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ab7d887b45a437cabdface06e8a9be1',ramdisk_id='',reservation_id='r-vcs629i0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1111771467-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:02:14Z,user_data=None,user_id='7033feaa27a8427197df3725be3d1a7a',uuid=e905a08b-37ed-4341-8762-bf8b43688cc5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.446 183181 DEBUG nova.network.os_vif_util [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converting VIF {"id": "f5880f43-5845-4174-b639-e51cab7c6f2f", "address": "fa:16:3e:7e:ef:83", "network": {"id": "d1220de0-89fa-4020-84a8-6d0a816a5b3c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-8486886-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "351c14c8e6aa42869583e2a01f2ea90f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf5880f43-58", "ovs_interfaceid": "f5880f43-5845-4174-b639-e51cab7c6f2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.447 183181 DEBUG nova.network.os_vif_util [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.448 183181 DEBUG os_vif [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.451 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.452 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5880f43-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.454 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.456 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.457 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.458 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b0f7ffdf-7745-4654-8a14-613696967cd1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.459 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.460 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.463 183181 INFO os_vif [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:ef:83,bridge_name='br-int',has_traffic_filtering=True,id=f5880f43-5845-4174-b639-e51cab7c6f2f,network=Network(d1220de0-89fa-4020-84a8-6d0a816a5b3c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf5880f43-58')
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.464 183181 INFO nova.virt.libvirt.driver [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Deleting instance files /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5_del
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.465 183181 INFO nova.virt.libvirt.driver [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Deletion of /var/lib/nova/instances/e905a08b-37ed-4341-8762-bf8b43688cc5_del complete
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.668 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.876 183181 INFO nova.scheduler.client.report [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration b8631618-5997-4897-98a0-76cada1d9d5d
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.876 183181 DEBUG nova.virt.libvirt.driver [None req-f26ecc14-bdff-4a5a-81eb-6e9ab1914756 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: c1c1164e-2433-4d44-a423-73e2a317c3c1] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.980 183181 INFO nova.compute.manager [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.980 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.980 183181 DEBUG nova.compute.manager [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.981 183181 DEBUG nova.network.neutron [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 20:03:05 compute-0 nova_compute[183177]: 2026-01-26 20:03:05.981 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:03:06 compute-0 nova_compute[183177]: 2026-01-26 20:03:06.198 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:03:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:06.556 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:03:06 compute-0 nova_compute[183177]: 2026-01-26 20:03:06.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:06.558 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:03:06 compute-0 nova_compute[183177]: 2026-01-26 20:03:06.657 183181 DEBUG nova.compute.manager [req-e6481d9f-543f-4c57-8c66-9441db841b72 req-eb8d9b54-b42e-4466-b204-d6fd9d686c45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-deleted-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:03:06 compute-0 nova_compute[183177]: 2026-01-26 20:03:06.658 183181 INFO nova.compute.manager [req-e6481d9f-543f-4c57-8c66-9441db841b72 req-eb8d9b54-b42e-4466-b204-d6fd9d686c45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Neutron deleted interface f5880f43-5845-4174-b639-e51cab7c6f2f; detaching it from the instance and deleting it from the info cache
Jan 26 20:03:06 compute-0 nova_compute[183177]: 2026-01-26 20:03:06.658 183181 DEBUG nova.network.neutron [req-e6481d9f-543f-4c57-8c66-9441db841b72 req-eb8d9b54-b42e-4466-b204-d6fd9d686c45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.107 183181 DEBUG nova.network.neutron [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.167 183181 DEBUG nova.compute.manager [req-e6481d9f-543f-4c57-8c66-9441db841b72 req-eb8d9b54-b42e-4466-b204-d6fd9d686c45 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Detach interface failed, port_id=f5880f43-5845-4174-b639-e51cab7c6f2f, reason: Instance e905a08b-37ed-4341-8762-bf8b43688cc5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.451 183181 DEBUG nova.compute.manager [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.451 183181 DEBUG oslo_concurrency.lockutils [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.451 183181 DEBUG oslo_concurrency.lockutils [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.452 183181 DEBUG oslo_concurrency.lockutils [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.452 183181 DEBUG nova.compute.manager [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] No waiting events found dispatching network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.452 183181 DEBUG nova.compute.manager [req-269f76d0-61a2-4815-90c3-70b2ea4b9090 req-b38da48a-e4eb-4d8b-858c-b368b4cdd70b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Received event network-vif-unplugged-f5880f43-5845-4174-b639-e51cab7c6f2f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.613 183181 INFO nova.compute.manager [-] [instance: e905a08b-37ed-4341-8762-bf8b43688cc5] Took 1.63 seconds to deallocate network for instance.
Jan 26 20:03:07 compute-0 nova_compute[183177]: 2026-01-26 20:03:07.670 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.149 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.150 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.211 183181 DEBUG nova.compute.provider_tree [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.664 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.665 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:08 compute-0 nova_compute[183177]: 2026-01-26 20:03:08.722 183181 DEBUG nova.scheduler.client.report [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.182 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.233 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.236 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.054s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.236 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.237 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.257 183181 INFO nova.scheduler.client.report [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Deleted allocations for instance e905a08b-37ed-4341-8762-bf8b43688cc5
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.456 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.458 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.500 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.501 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.09809875488281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.501 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:09 compute-0 nova_compute[183177]: 2026-01-26 20:03:09.501 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:10 compute-0 nova_compute[183177]: 2026-01-26 20:03:10.295 183181 DEBUG oslo_concurrency.lockutils [None req-7aae1bfe-d893-453d-98a4-e78e9a1ff11f 7033feaa27a8427197df3725be3d1a7a 3ab7d887b45a437cabdface06e8a9be1 - - default default] Lock "e905a08b-37ed-4341-8762-bf8b43688cc5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.169s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:10 compute-0 podman[213845]: 2026-01-26 20:03:10.322030895 +0000 UTC m=+0.072377869 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:03:10 compute-0 nova_compute[183177]: 2026-01-26 20:03:10.459 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:10 compute-0 nova_compute[183177]: 2026-01-26 20:03:10.532 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:03:10 compute-0 nova_compute[183177]: 2026-01-26 20:03:10.532 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:03:09 up  1:27,  0 user,  load average: 0.33, 0.29, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:03:10 compute-0 nova_compute[183177]: 2026-01-26 20:03:10.550 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:03:11 compute-0 nova_compute[183177]: 2026-01-26 20:03:11.057 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:03:11 compute-0 nova_compute[183177]: 2026-01-26 20:03:11.568 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:03:11 compute-0 nova_compute[183177]: 2026-01-26 20:03:11.569 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.067s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:12 compute-0 nova_compute[183177]: 2026-01-26 20:03:12.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:12 compute-0 nova_compute[183177]: 2026-01-26 20:03:12.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:03:12 compute-0 nova_compute[183177]: 2026-01-26 20:03:12.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:12 compute-0 nova_compute[183177]: 2026-01-26 20:03:12.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 20:03:12 compute-0 nova_compute[183177]: 2026-01-26 20:03:12.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:13 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:13.559 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:03:13 compute-0 nova_compute[183177]: 2026-01-26 20:03:13.661 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:15 compute-0 nova_compute[183177]: 2026-01-26 20:03:15.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:15 compute-0 nova_compute[183177]: 2026-01-26 20:03:15.462 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:16 compute-0 nova_compute[183177]: 2026-01-26 20:03:16.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:17 compute-0 nova_compute[183177]: 2026-01-26 20:03:17.748 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:19 compute-0 sshd-session[213869]: Invalid user sgf from 193.32.162.151 port 48994
Jan 26 20:03:20 compute-0 sshd-session[213869]: Connection closed by invalid user sgf 193.32.162.151 port 48994 [preauth]
Jan 26 20:03:20 compute-0 nova_compute[183177]: 2026-01-26 20:03:20.464 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:22 compute-0 nova_compute[183177]: 2026-01-26 20:03:22.782 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:23 compute-0 nova_compute[183177]: 2026-01-26 20:03:23.746 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:24.088 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:24.089 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:24.089 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:24 compute-0 sshd-session[213871]: Connection closed by authenticating user root 142.93.140.142 port 53840 [preauth]
Jan 26 20:03:25 compute-0 nova_compute[183177]: 2026-01-26 20:03:25.466 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:27 compute-0 nova_compute[183177]: 2026-01-26 20:03:27.783 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:29 compute-0 podman[192499]: time="2026-01-26T20:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:03:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:03:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 20:03:30 compute-0 nova_compute[183177]: 2026-01-26 20:03:30.468 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:31 compute-0 openstack_network_exporter[195363]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:03:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:03:31 compute-0 openstack_network_exporter[195363]: ERROR   20:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:03:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:03:32 compute-0 nova_compute[183177]: 2026-01-26 20:03:32.785 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:33 compute-0 nova_compute[183177]: 2026-01-26 20:03:33.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:03:33 compute-0 podman[213876]: 2026-01-26 20:03:33.319510593 +0000 UTC m=+0.062754101 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 20:03:33 compute-0 podman[213875]: 2026-01-26 20:03:33.340936709 +0000 UTC m=+0.080005724 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Jan 26 20:03:33 compute-0 podman[213874]: 2026-01-26 20:03:33.399066813 +0000 UTC m=+0.147365087 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 20:03:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:35.369 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:c2:e0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3354485a76de41f592c90f9741b8c515', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=63338c40-169b-4962-a6c8-8ca20b375080) old=Port_Binding(mac=['fa:16:3e:f7:c2:e0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3354485a76de41f592c90f9741b8c515', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:03:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:35.370 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 63338c40-169b-4962-a6c8-8ca20b375080 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c updated
Jan 26 20:03:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:35.371 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:03:35 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:35.372 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d9511bcf-55d0-4264-bb40-f82bc79a2916]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:35 compute-0 nova_compute[183177]: 2026-01-26 20:03:35.514 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:37 compute-0 nova_compute[183177]: 2026-01-26 20:03:37.824 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:40 compute-0 nova_compute[183177]: 2026-01-26 20:03:40.515 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:41 compute-0 podman[213941]: 2026-01-26 20:03:41.334017295 +0000 UTC m=+0.078380871 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:03:41 compute-0 sshd-session[213939]: Connection closed by authenticating user root 188.166.116.149 port 47470 [preauth]
Jan 26 20:03:42 compute-0 nova_compute[183177]: 2026-01-26 20:03:42.865 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:44.290 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:0d:3f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b399cb3d-7ceb-4a3d-907d-f17b7aaa638f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=53c17ff6-5c94-485b-b61f-60710ac3678f) old=Port_Binding(mac=['fa:16:3e:2c:0d:3f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:03:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:44.291 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 53c17ff6-5c94-485b-b61f-60710ac3678f in datapath dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266 updated
Jan 26 20:03:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:44.292 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dfd15f13-3a5c-49dd-9d39-2b8aeb7a6266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:03:44 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:03:44.293 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc44185-c32e-4181-8942-9d9914fc311b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:03:45 compute-0 nova_compute[183177]: 2026-01-26 20:03:45.517 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:47 compute-0 nova_compute[183177]: 2026-01-26 20:03:47.922 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:50 compute-0 nova_compute[183177]: 2026-01-26 20:03:50.519 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:52 compute-0 nova_compute[183177]: 2026-01-26 20:03:52.962 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:55 compute-0 nova_compute[183177]: 2026-01-26 20:03:55.521 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:55 compute-0 nova_compute[183177]: 2026-01-26 20:03:55.944 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:55 compute-0 nova_compute[183177]: 2026-01-26 20:03:55.944 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:56 compute-0 nova_compute[183177]: 2026-01-26 20:03:56.451 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:03:57 compute-0 nova_compute[183177]: 2026-01-26 20:03:57.024 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:03:57 compute-0 nova_compute[183177]: 2026-01-26 20:03:57.025 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:03:57 compute-0 nova_compute[183177]: 2026-01-26 20:03:57.035 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:03:57 compute-0 nova_compute[183177]: 2026-01-26 20:03:57.036 183181 INFO nova.compute.claims [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:03:57 compute-0 ovn_controller[95396]: 2026-01-26T20:03:57Z|00197|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 20:03:57 compute-0 nova_compute[183177]: 2026-01-26 20:03:57.965 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:03:58 compute-0 nova_compute[183177]: 2026-01-26 20:03:58.117 183181 DEBUG nova.compute.provider_tree [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:03:58 compute-0 nova_compute[183177]: 2026-01-26 20:03:58.626 183181 DEBUG nova.scheduler.client.report [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.140 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.141 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.659 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.659 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.661 183181 WARNING neutronclient.v2_0.client [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:03:59 compute-0 nova_compute[183177]: 2026-01-26 20:03:59.662 183181 WARNING neutronclient.v2_0.client [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:03:59 compute-0 podman[192499]: time="2026-01-26T20:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:03:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:03:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 26 20:04:00 compute-0 nova_compute[183177]: 2026-01-26 20:04:00.178 183181 INFO nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:04:00 compute-0 nova_compute[183177]: 2026-01-26 20:04:00.521 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:00 compute-0 nova_compute[183177]: 2026-01-26 20:04:00.676 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:00 compute-0 nova_compute[183177]: 2026-01-26 20:04:00.688 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:04:01 compute-0 sshd-session[213966]: Connection closed by authenticating user root 142.93.140.142 port 44728 [preauth]
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.261 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Successfully created port: 3c2d86fb-aa08-415f-878e-b675f3b8a580 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:04:01 compute-0 openstack_network_exporter[195363]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:04:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:04:01 compute-0 openstack_network_exporter[195363]: ERROR   20:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:04:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.712 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.713 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.714 183181 INFO nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Creating image(s)
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.715 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.715 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.716 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.716 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.721 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.724 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.794 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.795 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.796 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.796 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.799 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.800 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.865 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.866 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.899 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.900 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.901 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.951 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.952 183181 DEBUG nova.virt.disk.api [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Checking if we can resize image /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:04:01 compute-0 nova_compute[183177]: 2026-01-26 20:04:01.953 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.000 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.001 183181 DEBUG nova.virt.disk.api [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Cannot resize image /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.001 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.001 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Ensure instance console log exists: /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.002 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.002 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.002 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.556 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Successfully updated port: 3c2d86fb-aa08-415f-878e-b675f3b8a580 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.654 183181 DEBUG nova.compute.manager [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-changed-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.655 183181 DEBUG nova.compute.manager [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Refreshing instance network info cache due to event network-changed-3c2d86fb-aa08-415f-878e-b675f3b8a580. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.655 183181 DEBUG oslo_concurrency.lockutils [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.655 183181 DEBUG oslo_concurrency.lockutils [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.656 183181 DEBUG nova.network.neutron [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Refreshing network info cache for port 3c2d86fb-aa08-415f-878e-b675f3b8a580 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:04:02 compute-0 nova_compute[183177]: 2026-01-26 20:04:02.965 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.066 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.164 183181 WARNING neutronclient.v2_0.client [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.276 183181 DEBUG nova.network.neutron [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.451 183181 DEBUG nova.network.neutron [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.960 183181 DEBUG oslo_concurrency.lockutils [req-620c51b5-3494-40f3-9caf-3bd2d7d84ab2 req-859c440e-5af3-4695-b2e1-0960f14de154 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.961 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquired lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:04:03 compute-0 nova_compute[183177]: 2026-01-26 20:04:03.961 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:04:04 compute-0 nova_compute[183177]: 2026-01-26 20:04:04.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:04 compute-0 podman[213985]: 2026-01-26 20:04:04.333892388 +0000 UTC m=+0.061046014 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:04:04 compute-0 podman[213984]: 2026-01-26 20:04:04.344947985 +0000 UTC m=+0.079131740 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 26 20:04:04 compute-0 podman[213983]: 2026-01-26 20:04:04.357930856 +0000 UTC m=+0.097695651 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120)
Jan 26 20:04:05 compute-0 nova_compute[183177]: 2026-01-26 20:04:05.305 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:04:05 compute-0 nova_compute[183177]: 2026-01-26 20:04:05.523 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:06 compute-0 nova_compute[183177]: 2026-01-26 20:04:06.264 183181 WARNING neutronclient.v2_0.client [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:06 compute-0 nova_compute[183177]: 2026-01-26 20:04:06.558 183181 DEBUG nova.network.neutron [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updating instance_info_cache with network_info: [{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.066 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Releasing lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.067 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance network_info: |[{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.072 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Start _get_guest_xml network_info=[{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.079 183181 WARNING nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.081 183181 DEBUG nova.virt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368', uuid='74afc852-d448-4b15-b808-f3949eeee83c'), owner=OwnerMeta(userid='b3d5258d30ef4be39230c019f11bed8f', username='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin', projectid='8d30bc5631f24a6799364d53cb4e9465', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457847.0814137) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.088 183181 DEBUG nova.virt.libvirt.host [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.089 183181 DEBUG nova.virt.libvirt.host [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.093 183181 DEBUG nova.virt.libvirt.host [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.094 183181 DEBUG nova.virt.libvirt.host [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.095 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.095 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.096 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.096 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.096 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.096 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.096 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.097 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.097 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.097 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.097 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.097 183181 DEBUG nova.virt.hardware [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.101 183181 DEBUG nova.virt.libvirt.vif [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:03:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1526182',id=26,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-1diwfc47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:04:00Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=74afc852-d448-4b15-b808-f3949eeee83c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.102 183181 DEBUG nova.network.os_vif_util [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.102 183181 DEBUG nova.network.os_vif_util [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.103 183181 DEBUG nova.objects.instance [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lazy-loading 'pci_devices' on Instance uuid 74afc852-d448-4b15-b808-f3949eeee83c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.610 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <uuid>74afc852-d448-4b15-b808-f3949eeee83c</uuid>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <name>instance-0000001a</name>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368</nova:name>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:07</nova:creationTime>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:04:07 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:04:07 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         <nova:port uuid="3c2d86fb-aa08-415f-878e-b675f3b8a580">
Jan 26 20:04:07 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <system>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="serial">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="uuid">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </system>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <os>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </os>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <features>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </features>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:26:c9:7c"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <target dev="tap3c2d86fb-aa"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <video>
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </video>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:04:07 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:04:07 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:04:07 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:04:07 compute-0 nova_compute[183177]: </domain>
Jan 26 20:04:07 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.612 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Preparing to wait for external event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.612 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.612 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.613 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.613 183181 DEBUG nova.virt.libvirt.vif [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:03:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1526182',id=26,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-1diwfc47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',o
wner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:04:00Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=74afc852-d448-4b15-b808-f3949eeee83c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.614 183181 DEBUG nova.network.os_vif_util [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.614 183181 DEBUG nova.network.os_vif_util [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.615 183181 DEBUG os_vif [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.615 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.615 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.616 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.616 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.617 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '19799fed-73d1-5273-984d-6f04b3cfc878', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.618 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.619 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.622 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.622 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c2d86fb-aa, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.622 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3c2d86fb-aa, col_values=(('qos', UUID('dc94bc77-69b8-44a9-bd5f-a3deee00c652')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.622 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3c2d86fb-aa, col_values=(('external_ids', {'iface-id': '3c2d86fb-aa08-415f-878e-b675f3b8a580', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:c9:7c', 'vm-uuid': '74afc852-d448-4b15-b808-f3949eeee83c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.623 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 NetworkManager[55489]: <info>  [1769457847.6246] manager: (tap3c2d86fb-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.625 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.629 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.629 183181 INFO os_vif [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa')
Jan 26 20:04:07 compute-0 nova_compute[183177]: 2026-01-26 20:04:07.967 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:09 compute-0 nova_compute[183177]: 2026-01-26 20:04:09.185 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:04:09 compute-0 nova_compute[183177]: 2026-01-26 20:04:09.186 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:04:09 compute-0 nova_compute[183177]: 2026-01-26 20:04:09.187 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No VIF found with MAC fa:16:3e:26:c9:7c, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:04:09 compute-0 nova_compute[183177]: 2026-01-26 20:04:09.187 183181 INFO nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Using config drive
Jan 26 20:04:09 compute-0 nova_compute[183177]: 2026-01-26 20:04:09.702 183181 WARNING neutronclient.v2_0.client [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.293 183181 INFO nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Creating config drive at /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.303 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp18k3_irv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.440 183181 DEBUG oslo_concurrency.processutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp18k3_irv" returned: 0 in 0.137s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.5359] manager: (tap3c2d86fb-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 26 20:04:10 compute-0 kernel: tap3c2d86fb-aa: entered promiscuous mode
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.537 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 ovn_controller[95396]: 2026-01-26T20:04:10Z|00198|binding|INFO|Claiming lport 3c2d86fb-aa08-415f-878e-b675f3b8a580 for this chassis.
Jan 26 20:04:10 compute-0 ovn_controller[95396]: 2026-01-26T20:04:10Z|00199|binding|INFO|3c2d86fb-aa08-415f-878e-b675f3b8a580: Claiming fa:16:3e:26:c9:7c 10.100.0.13
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.541 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.561 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:c9:7c 10.100.0.13'], port_security=['fa:16:3e:26:c9:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '74afc852-d448-4b15-b808-f3949eeee83c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=3c2d86fb-aa08-415f-878e-b675f3b8a580) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.563 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 3c2d86fb-aa08-415f-878e-b675f3b8a580 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c bound to our chassis
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.564 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.579 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8839e3be-d59d-4c48-ade1-b72796a5a56c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.580 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9f030a0-21 in ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:04:10 compute-0 systemd-udevd[214067]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.582 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9f030a0-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.582 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[21d47b1e-8e83-43db-a486-bad1d1f55120]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.583 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[13bd117c-7446-48dc-8e42-c29aa0380c87]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 systemd-machined[154465]: New machine qemu-19-instance-0000001a.
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.604 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[25b820d5-22ce-4d2c-97e2-2916d2a94200]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.6072] device (tap3c2d86fb-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.6091] device (tap3c2d86fb-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:04:10 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-0000001a.
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.616 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.624 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f532fe88-d0d4-4bb7-a927-fda2f88da94f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_controller[95396]: 2026-01-26T20:04:10Z|00200|binding|INFO|Setting lport 3c2d86fb-aa08-415f-878e-b675f3b8a580 ovn-installed in OVS
Jan 26 20:04:10 compute-0 ovn_controller[95396]: 2026-01-26T20:04:10Z|00201|binding|INFO|Setting lport 3c2d86fb-aa08-415f-878e-b675f3b8a580 up in Southbound
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.631 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.657 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[149b10c1-5f90-40f4-b73e-02e1fa0bd325]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.663 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fe670a6e-1ad6-4131-b9f1-c8a8e9f92355]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.6646] manager: (tapd9f030a0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.697 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[7d37b73d-c6de-4e96-8a4d-1ac955bfddbd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.699 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c589483e-bd03-4b06-adf6-0f3c60c284d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.7239] device (tapd9f030a0-20): carrier: link connected
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.737 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[8f805f20-fad6-4e39-8dc2-f693cd28fea9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.770 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a529a78a-8431-48f9-bc6b-20201121a888]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530961, 'reachable_time': 18138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214100, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.796 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[abcbd06f-dd8f-4faf-b702-272777f50674]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:c2e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530961, 'tstamp': 530961}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214101, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.828 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2754a9fd-5f15-42f8-80f9-7c483f554590]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530961, 'reachable_time': 18138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214102, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.877 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ea124f60-8e21-4581-9124-11013401639e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.961 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9ddc7e-de84-4cdf-9268-a9be269c61f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.963 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.964 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.964 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.984 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 NetworkManager[55489]: <info>  [1769457850.9848] manager: (tapd9f030a0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 26 20:04:10 compute-0 kernel: tapd9f030a0-20: entered promiscuous mode
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.987 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:10.988 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:10 compute-0 ovn_controller[95396]: 2026-01-26T20:04:10Z|00202|binding|INFO|Releasing lport 63338c40-169b-4962-a6c8-8ca20b375080 from this chassis (sb_readonly=0)
Jan 26 20:04:10 compute-0 nova_compute[183177]: 2026-01-26 20:04:10.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.013 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.015 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1c83e9d5-0fc6-40b9-87ef-fa56184a0a8f]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.017 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.017 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.017 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.017 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.018 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5c74a989-343a-4d3d-9989-6ed34eac98f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.019 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.019 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe46ea1-8b49-4db7-b071-b0df2002f296]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.020 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:04:11 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:11.021 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'env', 'PROCESS_TAG=haproxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.455 183181 DEBUG nova.compute.manager [req-0cd54a04-3dd7-4d8d-bbe4-5d02a4bd5fc6 req-d33795d7-c593-4434-942b-1860bbf66fdf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.455 183181 DEBUG oslo_concurrency.lockutils [req-0cd54a04-3dd7-4d8d-bbe4-5d02a4bd5fc6 req-d33795d7-c593-4434-942b-1860bbf66fdf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.456 183181 DEBUG oslo_concurrency.lockutils [req-0cd54a04-3dd7-4d8d-bbe4-5d02a4bd5fc6 req-d33795d7-c593-4434-942b-1860bbf66fdf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.456 183181 DEBUG oslo_concurrency.lockutils [req-0cd54a04-3dd7-4d8d-bbe4-5d02a4bd5fc6 req-d33795d7-c593-4434-942b-1860bbf66fdf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.456 183181 DEBUG nova.compute.manager [req-0cd54a04-3dd7-4d8d-bbe4-5d02a4bd5fc6 req-d33795d7-c593-4434-942b-1860bbf66fdf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Processing event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.457 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.461 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.465 183181 INFO nova.virt.libvirt.driver [-] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance spawned successfully.
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.465 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:04:11 compute-0 podman[214141]: 2026-01-26 20:04:11.493911451 +0000 UTC m=+0.075372520 container create 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 20:04:11 compute-0 systemd[1]: Started libpod-conmon-5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0.scope.
Jan 26 20:04:11 compute-0 podman[214141]: 2026-01-26 20:04:11.460607884 +0000 UTC m=+0.042068973 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:04:11 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:04:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f6573c5eb3cb04ca1da582602a0215d076a1dace4febd82cfd79a6248d8c3b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:04:11 compute-0 podman[214141]: 2026-01-26 20:04:11.618990737 +0000 UTC m=+0.200451816 container init 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 20:04:11 compute-0 podman[214154]: 2026-01-26 20:04:11.62610788 +0000 UTC m=+0.078289439 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:04:11 compute-0 podman[214141]: 2026-01-26 20:04:11.628810071 +0000 UTC m=+0.210271140 container start 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:04:11 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [NOTICE]   (214182) : New worker (214184) forked
Jan 26 20:04:11 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [NOTICE]   (214182) : Loading success.
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.743 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.802 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.803 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.854 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.983 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.985 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.986 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.986 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.987 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:11 compute-0 nova_compute[183177]: 2026-01-26 20:04:11.987 183181 DEBUG nova.virt.libvirt.driver [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.057 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.060 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.101 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.102 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.09737777709961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.102 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.102 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.498 183181 INFO nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Took 10.79 seconds to spawn the instance on the hypervisor.
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.499 183181 DEBUG nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.624 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:12 compute-0 nova_compute[183177]: 2026-01-26 20:04:12.969 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.040 183181 INFO nova.compute.manager [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Took 16.08 seconds to build instance.
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.207 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 74afc852-d448-4b15-b808-f3949eeee83c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.208 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.208 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:04:12 up  1:28,  0 user,  load average: 0.12, 0.24, 0.27\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.310 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.536 183181 DEBUG nova.compute.manager [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.537 183181 DEBUG oslo_concurrency.lockutils [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.537 183181 DEBUG oslo_concurrency.lockutils [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.537 183181 DEBUG oslo_concurrency.lockutils [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.538 183181 DEBUG nova.compute.manager [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.538 183181 WARNING nova.compute.manager [req-9cc60f8d-28a5-4dfd-9c70-983014c268b7 req-7201f59c-f370-46c3-9963-d5073fe0e36a 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received unexpected event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with vm_state active and task_state None.
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.546 183181 DEBUG oslo_concurrency.lockutils [None req-c2c6fdb3-f90f-44ec-a6b1-9ee091def489 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:13 compute-0 nova_compute[183177]: 2026-01-26 20:04:13.817 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:04:14 compute-0 nova_compute[183177]: 2026-01-26 20:04:14.329 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:04:14 compute-0 nova_compute[183177]: 2026-01-26 20:04:14.330 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.228s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:15 compute-0 nova_compute[183177]: 2026-01-26 20:04:15.332 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:15 compute-0 nova_compute[183177]: 2026-01-26 20:04:15.332 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:15 compute-0 nova_compute[183177]: 2026-01-26 20:04:15.333 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:04:16 compute-0 nova_compute[183177]: 2026-01-26 20:04:16.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:17 compute-0 nova_compute[183177]: 2026-01-26 20:04:17.664 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:17 compute-0 nova_compute[183177]: 2026-01-26 20:04:17.973 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:18 compute-0 nova_compute[183177]: 2026-01-26 20:04:18.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:18 compute-0 sshd-session[214200]: Connection closed by authenticating user root 188.166.116.149 port 55168 [preauth]
Jan 26 20:04:19 compute-0 nova_compute[183177]: 2026-01-26 20:04:19.219 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:19 compute-0 nova_compute[183177]: 2026-01-26 20:04:19.220 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:19 compute-0 nova_compute[183177]: 2026-01-26 20:04:19.727 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:04:20 compute-0 nova_compute[183177]: 2026-01-26 20:04:20.292 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:20 compute-0 nova_compute[183177]: 2026-01-26 20:04:20.293 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:20 compute-0 nova_compute[183177]: 2026-01-26 20:04:20.303 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:04:20 compute-0 nova_compute[183177]: 2026-01-26 20:04:20.304 183181 INFO nova.compute.claims [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:04:21 compute-0 nova_compute[183177]: 2026-01-26 20:04:21.382 183181 DEBUG nova.compute.provider_tree [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:04:21 compute-0 nova_compute[183177]: 2026-01-26 20:04:21.929 183181 DEBUG nova.scheduler.client.report [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.442 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.444 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.670 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.960 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.960 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.961 183181 WARNING neutronclient.v2_0.client [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.961 183181 WARNING neutronclient.v2_0.client [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:22 compute-0 nova_compute[183177]: 2026-01-26 20:04:22.975 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:23 compute-0 ovn_controller[95396]: 2026-01-26T20:04:23Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:c9:7c 10.100.0.13
Jan 26 20:04:23 compute-0 ovn_controller[95396]: 2026-01-26T20:04:23Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:c9:7c 10.100.0.13
Jan 26 20:04:23 compute-0 nova_compute[183177]: 2026-01-26 20:04:23.471 183181 INFO nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:04:23 compute-0 nova_compute[183177]: 2026-01-26 20:04:23.985 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:04:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:24.091 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:24.091 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:24.093 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:24 compute-0 nova_compute[183177]: 2026-01-26 20:04:24.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:04:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:24.294 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:04:24 compute-0 nova_compute[183177]: 2026-01-26 20:04:24.294 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:24.295 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:04:24 compute-0 nova_compute[183177]: 2026-01-26 20:04:24.482 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Successfully created port: 2220e11e-047f-472c-b3b5-fc824d195b31 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.010 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.012 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.012 183181 INFO nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Creating image(s)
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.013 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.014 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.015 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.016 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.022 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.024 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.121 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.122 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.123 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.124 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.130 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.130 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.198 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.199 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.240 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.241 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.242 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.307 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.308 183181 DEBUG nova.virt.disk.api [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Checking if we can resize image /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.309 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.366 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.368 183181 DEBUG nova.virt.disk.api [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Cannot resize image /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.368 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.369 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Ensure instance console log exists: /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.370 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.370 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.370 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.448 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Successfully updated port: 2220e11e-047f-472c-b3b5-fc824d195b31 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.511 183181 DEBUG nova.compute.manager [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-changed-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.512 183181 DEBUG nova.compute.manager [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Refreshing instance network info cache due to event network-changed-2220e11e-047f-472c-b3b5-fc824d195b31. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.512 183181 DEBUG oslo_concurrency.lockutils [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.513 183181 DEBUG oslo_concurrency.lockutils [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.513 183181 DEBUG nova.network.neutron [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Refreshing network info cache for port 2220e11e-047f-472c-b3b5-fc824d195b31 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:04:25 compute-0 nova_compute[183177]: 2026-01-26 20:04:25.956 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.041 183181 WARNING neutronclient.v2_0.client [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.306 183181 DEBUG nova.network.neutron [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.477 183181 DEBUG nova.network.neutron [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.984 183181 DEBUG oslo_concurrency.lockutils [req-81f30cc3-31ba-41fd-8bd0-298ddd482134 req-0f887737-d75d-425d-9666-136da49357d4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.984 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquired lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:04:26 compute-0 nova_compute[183177]: 2026-01-26 20:04:26.985 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:04:27 compute-0 nova_compute[183177]: 2026-01-26 20:04:27.711 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:27 compute-0 nova_compute[183177]: 2026-01-26 20:04:27.977 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:28 compute-0 nova_compute[183177]: 2026-01-26 20:04:28.282 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:04:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:28.298 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:29 compute-0 nova_compute[183177]: 2026-01-26 20:04:29.266 183181 WARNING neutronclient.v2_0.client [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:29 compute-0 nova_compute[183177]: 2026-01-26 20:04:29.575 183181 DEBUG nova.network.neutron [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating instance_info_cache with network_info: [{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:04:29 compute-0 podman[192499]: time="2026-01-26T20:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:04:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:04:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.092 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Releasing lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.092 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance network_info: |[{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.095 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Start _get_guest_xml network_info=[{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.100 183181 WARNING nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.102 183181 DEBUG nova.virt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729', uuid='6adcbc8f-e643-4839-981f-0ccabe843c29'), owner=OwnerMeta(userid='b3d5258d30ef4be39230c019f11bed8f', username='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin', projectid='8d30bc5631f24a6799364d53cb4e9465', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", 
"ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769457870.102012) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.106 183181 DEBUG nova.virt.libvirt.host [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.106 183181 DEBUG nova.virt.libvirt.host [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.110 183181 DEBUG nova.virt.libvirt.host [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.111 183181 DEBUG nova.virt.libvirt.host [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.112 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.112 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.113 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.113 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.113 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.114 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.114 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.114 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.115 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.115 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.115 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.116 183181 DEBUG nova.virt.hardware [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.120 183181 DEBUG nova.virt.libvirt.vif [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2107227',id=27,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-w07hfiyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_
name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:04:24Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=6adcbc8f-e643-4839-981f-0ccabe843c29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.120 183181 DEBUG nova.network.os_vif_util [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.121 183181 DEBUG nova.network.os_vif_util [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.123 183181 DEBUG nova.objects.instance [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6adcbc8f-e643-4839-981f-0ccabe843c29 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.634 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <uuid>6adcbc8f-e643-4839-981f-0ccabe843c29</uuid>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <name>instance-0000001b</name>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729</nova:name>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:30</nova:creationTime>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:04:30 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:04:30 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         <nova:port uuid="2220e11e-047f-472c-b3b5-fc824d195b31">
Jan 26 20:04:30 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <system>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="serial">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="uuid">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </system>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <os>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </os>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <features>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </features>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:ce:3d:9e"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <target dev="tap2220e11e-04"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <video>
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </video>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:04:30 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:04:30 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:04:30 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:04:30 compute-0 nova_compute[183177]: </domain>
Jan 26 20:04:30 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.635 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Preparing to wait for external event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.636 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.636 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.636 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.637 183181 DEBUG nova.virt.libvirt.vif [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2107227',id=27,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-w07hfiyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',o
wner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:04:24Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=6adcbc8f-e643-4839-981f-0ccabe843c29,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.637 183181 DEBUG nova.network.os_vif_util [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.638 183181 DEBUG nova.network.os_vif_util [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.638 183181 DEBUG os_vif [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.638 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.639 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.639 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.640 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.640 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f91b44d7-26fb-5d2c-94f8-56db3616f7ff', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.641 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.641 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.642 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.643 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.645 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.645 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2220e11e-04, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.645 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2220e11e-04, col_values=(('qos', UUID('67a698de-7be0-4b5a-9022-c275ded39d1d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.646 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2220e11e-04, col_values=(('external_ids', {'iface-id': '2220e11e-047f-472c-b3b5-fc824d195b31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:3d:9e', 'vm-uuid': '6adcbc8f-e643-4839-981f-0ccabe843c29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:30 compute-0 NetworkManager[55489]: <info>  [1769457870.6479] manager: (tap2220e11e-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.647 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.650 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:30 compute-0 nova_compute[183177]: 2026-01-26 20:04:30.654 183181 INFO os_vif [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04')
Jan 26 20:04:31 compute-0 openstack_network_exporter[195363]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:04:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:04:31 compute-0 openstack_network_exporter[195363]: ERROR   20:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:04:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.202 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.202 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.202 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No VIF found with MAC fa:16:3e:ce:3d:9e, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.203 183181 INFO nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Using config drive
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.712 183181 WARNING neutronclient.v2_0.client [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:04:32 compute-0 nova_compute[183177]: 2026-01-26 20:04:32.978 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.428 183181 INFO nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Creating config drive at /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.441 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjgva8as5 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.592 183181 DEBUG oslo_concurrency.processutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjgva8as5" returned: 0 in 0.152s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:04:33 compute-0 kernel: tap2220e11e-04: entered promiscuous mode
Jan 26 20:04:33 compute-0 NetworkManager[55489]: <info>  [1769457873.6829] manager: (tap2220e11e-04): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 26 20:04:33 compute-0 ovn_controller[95396]: 2026-01-26T20:04:33Z|00203|binding|INFO|Claiming lport 2220e11e-047f-472c-b3b5-fc824d195b31 for this chassis.
Jan 26 20:04:33 compute-0 ovn_controller[95396]: 2026-01-26T20:04:33Z|00204|binding|INFO|2220e11e-047f-472c-b3b5-fc824d195b31: Claiming fa:16:3e:ce:3d:9e 10.100.0.8
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.684 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.692 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:3d:9e 10.100.0.8'], port_security=['fa:16:3e:ce:3d:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6adcbc8f-e643-4839-981f-0ccabe843c29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=2220e11e-047f-472c-b3b5-fc824d195b31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.693 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 2220e11e-047f-472c-b3b5-fc824d195b31 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c bound to our chassis
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.696 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:04:33 compute-0 ovn_controller[95396]: 2026-01-26T20:04:33Z|00205|binding|INFO|Setting lport 2220e11e-047f-472c-b3b5-fc824d195b31 ovn-installed in OVS
Jan 26 20:04:33 compute-0 ovn_controller[95396]: 2026-01-26T20:04:33Z|00206|binding|INFO|Setting lport 2220e11e-047f-472c-b3b5-fc824d195b31 up in Southbound
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.702 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.707 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.729 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[84018b02-117e-44cc-a325-f2da768de2a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 systemd-machined[154465]: New machine qemu-20-instance-0000001b.
Jan 26 20:04:33 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001b.
Jan 26 20:04:33 compute-0 systemd-udevd[214267]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.780 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[eac15178-24e9-490b-b064-cbe36ce3bc1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.785 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[b69e0602-31fb-4631-9612-f786831b2ca4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 NetworkManager[55489]: <info>  [1769457873.7955] device (tap2220e11e-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:04:33 compute-0 NetworkManager[55489]: <info>  [1769457873.7977] device (tap2220e11e-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.838 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[5936a3d1-f172-41e6-bc34-270c4455e6f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.868 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d53dafe-a690-40ef-906e-0b4857bc1449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530961, 'reachable_time': 18138, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214277, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.888 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8102d12d-4e48-4120-a295-9aab96f689fc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530979, 'tstamp': 530979}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214279, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530983, 'tstamp': 530983}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214279, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.890 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.892 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 nova_compute[183177]: 2026-01-26 20:04:33.893 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.894 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.894 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.895 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.895 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:04:33 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:04:33.897 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7308bc21-7113-46ff-ae2c-b1c1a1ae146d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:04:34 compute-0 nova_compute[183177]: 2026-01-26 20:04:34.564 183181 DEBUG nova.compute.manager [req-f8583cf2-467d-4b12-b442-d3bf232b5fd3 req-678f32a1-55b5-4d2a-a0fb-e652a88951ef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:34 compute-0 nova_compute[183177]: 2026-01-26 20:04:34.565 183181 DEBUG oslo_concurrency.lockutils [req-f8583cf2-467d-4b12-b442-d3bf232b5fd3 req-678f32a1-55b5-4d2a-a0fb-e652a88951ef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:34 compute-0 nova_compute[183177]: 2026-01-26 20:04:34.565 183181 DEBUG oslo_concurrency.lockutils [req-f8583cf2-467d-4b12-b442-d3bf232b5fd3 req-678f32a1-55b5-4d2a-a0fb-e652a88951ef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:34 compute-0 nova_compute[183177]: 2026-01-26 20:04:34.566 183181 DEBUG oslo_concurrency.lockutils [req-f8583cf2-467d-4b12-b442-d3bf232b5fd3 req-678f32a1-55b5-4d2a-a0fb-e652a88951ef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:34 compute-0 nova_compute[183177]: 2026-01-26 20:04:34.567 183181 DEBUG nova.compute.manager [req-f8583cf2-467d-4b12-b442-d3bf232b5fd3 req-678f32a1-55b5-4d2a-a0fb-e652a88951ef 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Processing event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.257 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.263 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.269 183181 INFO nova.virt.libvirt.driver [-] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance spawned successfully.
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.269 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:04:35 compute-0 podman[214289]: 2026-01-26 20:04:35.371717097 +0000 UTC m=+0.103180368 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 20:04:35 compute-0 podman[214288]: 2026-01-26 20:04:35.381172412 +0000 UTC m=+0.111829081 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Jan 26 20:04:35 compute-0 podman[214287]: 2026-01-26 20:04:35.448088063 +0000 UTC m=+0.176411789 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.647 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.799 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.800 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.800 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.801 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.802 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:35 compute-0 nova_compute[183177]: 2026-01-26 20:04:35.802 183181 DEBUG nova.virt.libvirt.driver [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.393 183181 INFO nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Took 11.38 seconds to spawn the instance on the hypervisor.
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.393 183181 DEBUG nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.670 183181 DEBUG nova.compute.manager [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.671 183181 DEBUG oslo_concurrency.lockutils [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.671 183181 DEBUG oslo_concurrency.lockutils [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.672 183181 DEBUG oslo_concurrency.lockutils [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.672 183181 DEBUG nova.compute.manager [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.672 183181 WARNING nova.compute.manager [req-6fb74d22-8b28-4b68-8721-ad4273bf91e7 req-71a21155-2584-4031-8613-006d9bd37f0b 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received unexpected event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with vm_state active and task_state None.
Jan 26 20:04:36 compute-0 nova_compute[183177]: 2026-01-26 20:04:36.935 183181 INFO nova.compute.manager [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Took 16.70 seconds to build instance.
Jan 26 20:04:37 compute-0 sshd-session[214351]: Connection closed by authenticating user root 142.93.140.142 port 40862 [preauth]
Jan 26 20:04:37 compute-0 nova_compute[183177]: 2026-01-26 20:04:37.442 183181 DEBUG oslo_concurrency.lockutils [None req-2e603b0e-bff6-455a-b46f-b98f815eaea4 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.222s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:04:37 compute-0 nova_compute[183177]: 2026-01-26 20:04:37.981 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:40 compute-0 nova_compute[183177]: 2026-01-26 20:04:40.649 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:42 compute-0 podman[214353]: 2026-01-26 20:04:42.34605203 +0000 UTC m=+0.085341898 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:04:42 compute-0 nova_compute[183177]: 2026-01-26 20:04:42.983 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:45 compute-0 nova_compute[183177]: 2026-01-26 20:04:45.651 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:47 compute-0 ovn_controller[95396]: 2026-01-26T20:04:47Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:3d:9e 10.100.0.8
Jan 26 20:04:47 compute-0 ovn_controller[95396]: 2026-01-26T20:04:47Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:3d:9e 10.100.0.8
Jan 26 20:04:47 compute-0 nova_compute[183177]: 2026-01-26 20:04:47.986 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:50 compute-0 nova_compute[183177]: 2026-01-26 20:04:50.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:52 compute-0 nova_compute[183177]: 2026-01-26 20:04:52.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:55 compute-0 nova_compute[183177]: 2026-01-26 20:04:55.658 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:57 compute-0 sshd-session[214386]: Connection closed by authenticating user root 188.166.116.149 port 46626 [preauth]
Jan 26 20:04:57 compute-0 nova_compute[183177]: 2026-01-26 20:04:57.992 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:04:59 compute-0 podman[192499]: time="2026-01-26T20:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:04:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:04:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Jan 26 20:05:00 compute-0 nova_compute[183177]: 2026-01-26 20:05:00.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:00 compute-0 nova_compute[183177]: 2026-01-26 20:05:00.659 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:01 compute-0 openstack_network_exporter[195363]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:05:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:05:01 compute-0 openstack_network_exporter[195363]: ERROR   20:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:05:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:05:02 compute-0 nova_compute[183177]: 2026-01-26 20:05:02.994 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:03 compute-0 ovn_controller[95396]: 2026-01-26T20:05:03Z|00207|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 20:05:05 compute-0 nova_compute[183177]: 2026-01-26 20:05:05.662 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:06 compute-0 nova_compute[183177]: 2026-01-26 20:05:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:06 compute-0 podman[214390]: 2026-01-26 20:05:06.325054867 +0000 UTC m=+0.064748973 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:05:06 compute-0 podman[214389]: 2026-01-26 20:05:06.330131754 +0000 UTC m=+0.073279334 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': 
True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 20:05:06 compute-0 podman[214388]: 2026-01-26 20:05:06.38644868 +0000 UTC m=+0.122258982 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 20:05:07 compute-0 nova_compute[183177]: 2026-01-26 20:05:07.996 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:09 compute-0 nova_compute[183177]: 2026-01-26 20:05:09.687 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Check if temp file /var/lib/nova/instances/tmpuu3iaret exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:05:09 compute-0 nova_compute[183177]: 2026-01-26 20:05:09.693 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuu3iaret',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='74afc852-d448-4b15-b808-f3949eeee83c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:05:10 compute-0 nova_compute[183177]: 2026-01-26 20:05:10.122 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Check if temp file /var/lib/nova/instances/tmp9mrq8s7t exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:05:10 compute-0 nova_compute[183177]: 2026-01-26 20:05:10.128 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9mrq8s7t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6adcbc8f-e643-4839-981f-0ccabe843c29',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:05:10 compute-0 nova_compute[183177]: 2026-01-26 20:05:10.664 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:11 compute-0 nova_compute[183177]: 2026-01-26 20:05:11.668 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.716 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.778 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.779 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.835 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.844 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.913 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.914 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.968 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:12 compute-0 nova_compute[183177]: 2026-01-26 20:05:12.997 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.167 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.169 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.215 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.216 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5414MB free_disk=73.04000473022461GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.216 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.216 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:13 compute-0 podman[214470]: 2026-01-26 20:05:13.308378483 +0000 UTC m=+0.058865715 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.866 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.924 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:13 compute-0 nova_compute[183177]: 2026-01-26 20:05:13.926 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.005 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.007 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Preparing to wait for external event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.008 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.009 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.009 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.238 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating resource usage from migration ff46a03d-c1c5-4193-aa9b-527250ff32d2
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.239 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updating resource usage from migration f042b9a5-f35b-43b9-b8eb-bf10a0230eb4
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.297 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration f042b9a5-f35b-43b9-b8eb-bf10a0230eb4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.298 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration ff46a03d-c1c5-4193-aa9b-527250ff32d2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.298 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.298 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:05:13 up  1:29,  0 user,  load average: 0.37, 0.31, 0.29\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '2', 'num_os_type_None': '2', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.405 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:05:14 compute-0 sshd-session[214500]: Connection closed by authenticating user root 142.93.140.142 port 41082 [preauth]
Jan 26 20:05:14 compute-0 nova_compute[183177]: 2026-01-26 20:05:14.914 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:05:15 compute-0 nova_compute[183177]: 2026-01-26 20:05:15.424 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:05:15 compute-0 nova_compute[183177]: 2026-01-26 20:05:15.425 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:15 compute-0 nova_compute[183177]: 2026-01-26 20:05:15.696 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:17 compute-0 nova_compute[183177]: 2026-01-26 20:05:17.425 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:17 compute-0 nova_compute[183177]: 2026-01-26 20:05:17.426 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:17 compute-0 nova_compute[183177]: 2026-01-26 20:05:17.427 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:17 compute-0 nova_compute[183177]: 2026-01-26 20:05:17.427 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:05:18 compute-0 nova_compute[183177]: 2026-01-26 20:05:18.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:18 compute-0 nova_compute[183177]: 2026-01-26 20:05:18.155 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.617 183181 DEBUG nova.compute.manager [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.618 183181 DEBUG oslo_concurrency.lockutils [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.618 183181 DEBUG oslo_concurrency.lockutils [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.619 183181 DEBUG oslo_concurrency.lockutils [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.619 183181 DEBUG nova.compute.manager [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No event matching network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 in dict_keys([('network-vif-plugged', '3c2d86fb-aa08-415f-878e-b675f3b8a580')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:05:19 compute-0 nova_compute[183177]: 2026-01-26 20:05:19.620 183181 DEBUG nova.compute.manager [req-3bd42fb3-09f5-436c-be8e-24a4c2528d1a req-d8c943fd-ce3c-47b6-99d0-b09ea4a156cb 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:20 compute-0 nova_compute[183177]: 2026-01-26 20:05:20.697 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.041 183181 INFO nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.821 183181 DEBUG nova.compute.manager [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.821 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.821 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.821 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.822 183181 DEBUG nova.compute.manager [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Processing event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.822 183181 DEBUG nova.compute.manager [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-changed-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.822 183181 DEBUG nova.compute.manager [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Refreshing instance network info cache due to event network-changed-3c2d86fb-aa08-415f-878e-b675f3b8a580. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.822 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.822 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.823 183181 DEBUG nova.network.neutron [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Refreshing network info cache for port 3c2d86fb-aa08-415f-878e-b675f3b8a580 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:05:21 compute-0 nova_compute[183177]: 2026-01-26 20:05:21.824 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.463 183181 WARNING neutronclient.v2_0.client [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.467 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpuu3iaret',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='74afc852-d448-4b15-b808-f3949eeee83c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f042b9a5-f35b-43b9-b8eb-bf10a0230eb4),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.947 183181 WARNING neutronclient.v2_0.client [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.984 183181 DEBUG nova.objects.instance [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 74afc852-d448-4b15-b808-f3949eeee83c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.985 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.986 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:05:22 compute-0 nova_compute[183177]: 2026-01-26 20:05:22.986 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.038 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.373 183181 DEBUG nova.network.neutron [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updated VIF entry in instance network info cache for port 3c2d86fb-aa08-415f-878e-b675f3b8a580. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.374 183181 DEBUG nova.network.neutron [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updating instance_info_cache with network_info: [{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.488 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.489 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.498 183181 DEBUG nova.virt.libvirt.vif [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:03:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1526182',id=26,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:04:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-1diwfc47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:04:12Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=74afc852-d448-4b15-b808-f3949eeee83c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.498 183181 DEBUG nova.network.os_vif_util [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.500 183181 DEBUG nova.network.os_vif_util [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.501 183181 DEBUG nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:26:c9:7c"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <target dev="tap3c2d86fb-aa"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]: </interface>
Jan 26 20:05:23 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.503 183181 DEBUG nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <name>instance-0000001a</name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <uuid>74afc852-d448-4b15-b808-f3949eeee83c</uuid>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368</nova:name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:07</nova:creationTime>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:port uuid="3c2d86fb-aa08-415f-878e-b675f3b8a580">
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="serial">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="uuid">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:26:c9:7c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c2d86fb-aa"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:23 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.504 183181 DEBUG nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <name>instance-0000001a</name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <uuid>74afc852-d448-4b15-b808-f3949eeee83c</uuid>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368</nova:name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:07</nova:creationTime>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:port uuid="3c2d86fb-aa08-415f-878e-b675f3b8a580">
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="serial">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="uuid">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:26:c9:7c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c2d86fb-aa"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:23 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.504 183181 DEBUG nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <name>instance-0000001a</name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <uuid>74afc852-d448-4b15-b808-f3949eeee83c</uuid>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368</nova:name>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:07</nova:creationTime>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <nova:port uuid="3c2d86fb-aa08-415f-878e-b675f3b8a580">
Jan 26 20:05:23 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="serial">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="uuid">74afc852-d448-4b15-b808-f3949eeee83c</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/disk.config"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:26:c9:7c"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target dev="tap3c2d86fb-aa"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:23 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c/console.log" append="off"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:23 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:23 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:23 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:23 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.505 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.883 183181 DEBUG oslo_concurrency.lockutils [req-fdc461f7-2345-4be7-a0ea-7f4193ae1ce9 req-f9b86cc9-24e8-4641-a8da-084e7b53c222 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-74afc852-d448-4b15-b808-f3949eeee83c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.991 183181 DEBUG nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:05:23 compute-0 nova_compute[183177]: 2026-01-26 20:05:23.992 183181 INFO nova.virt.libvirt.migration [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:05:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:24.093 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:24.094 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:24.096 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.031 183181 INFO nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:05:25 compute-0 kernel: tap3c2d86fb-aa (unregistering): left promiscuous mode
Jan 26 20:05:25 compute-0 NetworkManager[55489]: <info>  [1769457925.5459] device (tap3c2d86fb-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:05:25 compute-0 ovn_controller[95396]: 2026-01-26T20:05:25Z|00208|binding|INFO|Releasing lport 3c2d86fb-aa08-415f-878e-b675f3b8a580 from this chassis (sb_readonly=0)
Jan 26 20:05:25 compute-0 ovn_controller[95396]: 2026-01-26T20:05:25Z|00209|binding|INFO|Setting lport 3c2d86fb-aa08-415f-878e-b675f3b8a580 down in Southbound
Jan 26 20:05:25 compute-0 ovn_controller[95396]: 2026-01-26T20:05:25Z|00210|binding|INFO|Removing iface tap3c2d86fb-aa ovn-installed in OVS
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.560 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.566 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:c9:7c 10.100.0.13'], port_security=['fa:16:3e:26:c9:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '74afc852-d448-4b15-b808-f3949eeee83c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=3c2d86fb-aa08-415f-878e-b675f3b8a580) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.567 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 3c2d86fb-aa08-415f-878e-b675f3b8a580 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c unbound from our chassis
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.569 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.590 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.608 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[76cd7c87-5b26-47c8-95fb-d29e12d11278]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 26 20:05:25 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001a.scope: Consumed 16.184s CPU time.
Jan 26 20:05:25 compute-0 systemd-machined[154465]: Machine qemu-19-instance-0000001a terminated.
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.658 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[73a6bd26-5469-4fea-90a3-98f930373de6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.660 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e4d453-cb80-46f6-9e36-8fefa6333e8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.699 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.702 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[14b914e1-6f75-40b8-8940-21bb03540524]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.722 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a60192-a7f1-4caa-b880-e31e94871fda]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530961, 'reachable_time': 18096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214528, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.742 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b4fbc5-adf7-4ba4-b068-3be266a872d4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530979, 'tstamp': 530979}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214529, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530983, 'tstamp': 530983}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214529, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.744 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.745 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.751 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.752 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.752 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.752 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.752 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:05:25 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:25.753 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f1bf81b1-bb38-440e-8e32-414fa61197d7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.801 183181 DEBUG nova.virt.libvirt.guest [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.802 183181 INFO nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration operation has completed
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.802 183181 INFO nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] _post_live_migration() is started..
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.804 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.804 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.804 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.816 183181 WARNING neutronclient.v2_0.client [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:25 compute-0 nova_compute[183177]: 2026-01-26 20:05:25.816 183181 WARNING neutronclient.v2_0.client [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.458 183181 DEBUG nova.compute.manager [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.459 183181 DEBUG oslo_concurrency.lockutils [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.459 183181 DEBUG oslo_concurrency.lockutils [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.459 183181 DEBUG oslo_concurrency.lockutils [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.459 183181 DEBUG nova.compute.manager [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.460 183181 DEBUG nova.compute.manager [req-fe0a32f6-9933-4ac3-8a61-c9d19543d2c4 req-bb80a167-ac1c-44be-bf79-d0ffa22fd84e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:26.660 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.661 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:26.662 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.691 183181 DEBUG nova.compute.manager [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.691 183181 DEBUG oslo_concurrency.lockutils [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.691 183181 DEBUG oslo_concurrency.lockutils [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.691 183181 DEBUG oslo_concurrency.lockutils [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.692 183181 DEBUG nova.compute.manager [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:26 compute-0 nova_compute[183177]: 2026-01-26 20:05:26.692 183181 DEBUG nova.compute.manager [req-b9e0a036-823e-4f31-af5c-6e79b0f3b84e req-27e8177d-404f-43dc-9735-6069001e0807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.128 183181 DEBUG nova.network.neutron [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 3c2d86fb-aa08-415f-878e-b675f3b8a580 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.129 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.130 183181 DEBUG nova.virt.libvirt.vif [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:03:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1526182368',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1526182',id=26,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:04:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-1diwfc47',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:05:04Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=74afc852-d448-4b15-b808-f3949eeee83c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.131 183181 DEBUG nova.network.os_vif_util [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "address": "fa:16:3e:26:c9:7c", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c2d86fb-aa", "ovs_interfaceid": "3c2d86fb-aa08-415f-878e-b675f3b8a580", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.132 183181 DEBUG nova.network.os_vif_util [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.133 183181 DEBUG os_vif [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.136 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.137 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c2d86fb-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.139 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.142 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.143 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.144 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dc94bc77-69b8-44a9-bd5f-a3deee00c652) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:27 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.147 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:27 compute-0 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.148 183181 INFO os_vif [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:c9:7c,bridge_name='br-int',has_traffic_filtering=True,id=3c2d86fb-aa08-415f-878e-b675f3b8a580,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c2d86fb-aa')
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.149 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.149 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.149 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.149 183181 DEBUG nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.150 183181 INFO nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Deleting instance files /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c_del
Jan 26 20:05:27 compute-0 nova_compute[183177]: 2026-01-26 20:05:27.150 183181 INFO nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Deletion of /var/lib/nova/instances/74afc852-d448-4b15-b808-f3949eeee83c_del complete
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.041 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.605 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.607 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.607 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.608 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.608 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.608 183181 WARNING nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received unexpected event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with vm_state active and task_state migrating.
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.608 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.609 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.609 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.609 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.610 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.610 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-unplugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.610 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.610 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.611 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.611 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.611 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.611 183181 WARNING nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received unexpected event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with vm_state active and task_state migrating.
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.612 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.612 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.612 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.612 183181 DEBUG oslo_concurrency.lockutils [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.613 183181 DEBUG nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] No waiting events found dispatching network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:28 compute-0 nova_compute[183177]: 2026-01-26 20:05:28.613 183181 WARNING nova.compute.manager [req-e315a388-0508-4cef-8632-d78436adc3e7 req-eac3324b-a4df-437e-9dc2-491e87b31cc1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Received unexpected event network-vif-plugged-3c2d86fb-aa08-415f-878e-b675f3b8a580 for instance with vm_state active and task_state migrating.
Jan 26 20:05:29 compute-0 podman[192499]: time="2026-01-26T20:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:05:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:05:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Jan 26 20:05:31 compute-0 openstack_network_exporter[195363]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:05:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:05:31 compute-0 openstack_network_exporter[195363]: ERROR   20:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:05:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:05:31 compute-0 sshd-session[214549]: Invalid user hduser from 193.32.162.151 port 54584
Jan 26 20:05:31 compute-0 sshd-session[214549]: Connection closed by invalid user hduser 193.32.162.151 port 54584 [preauth]
Jan 26 20:05:32 compute-0 nova_compute[183177]: 2026-01-26 20:05:32.147 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:33 compute-0 nova_compute[183177]: 2026-01-26 20:05:33.044 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:34.663 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:35 compute-0 nova_compute[183177]: 2026-01-26 20:05:35.680 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "74afc852-d448-4b15-b808-f3949eeee83c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:35 compute-0 nova_compute[183177]: 2026-01-26 20:05:35.680 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:35 compute-0 nova_compute[183177]: 2026-01-26 20:05:35.681 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "74afc852-d448-4b15-b808-f3949eeee83c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:35 compute-0 sshd-session[214551]: Connection closed by authenticating user root 188.166.116.149 port 34308 [preauth]
Jan 26 20:05:36 compute-0 nova_compute[183177]: 2026-01-26 20:05:36.196 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:36 compute-0 nova_compute[183177]: 2026-01-26 20:05:36.197 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:36 compute-0 nova_compute[183177]: 2026-01-26 20:05:36.197 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:36 compute-0 nova_compute[183177]: 2026-01-26 20:05:36.197 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.150 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.256 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:37 compute-0 podman[214556]: 2026-01-26 20:05:37.326283719 +0000 UTC m=+0.070741435 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.349 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.350 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:37 compute-0 podman[214555]: 2026-01-26 20:05:37.381110525 +0000 UTC m=+0.114148564 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git)
Jan 26 20:05:37 compute-0 podman[214554]: 2026-01-26 20:05:37.38170467 +0000 UTC m=+0.128653434 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.409 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.573 183181 WARNING nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.574 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.594 183181 DEBUG oslo_concurrency.processutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.595 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5574MB free_disk=73.06918716430664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.595 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:37 compute-0 nova_compute[183177]: 2026-01-26 20:05:37.596 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:38 compute-0 nova_compute[183177]: 2026-01-26 20:05:38.103 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:38 compute-0 nova_compute[183177]: 2026-01-26 20:05:38.616 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 74afc852-d448-4b15-b808-f3949eeee83c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.252 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.253 183181 INFO nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating resource usage from migration ff46a03d-c1c5-4193-aa9b-527250ff32d2
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.304 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration f042b9a5-f35b-43b9-b8eb-bf10a0230eb4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.305 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration ff46a03d-c1c5-4193-aa9b-527250ff32d2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.306 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.306 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:05:37 up  1:29,  0 user,  load average: 0.31, 0.30, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.383 183181 DEBUG nova.compute.provider_tree [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:05:39 compute-0 nova_compute[183177]: 2026-01-26 20:05:39.892 183181 DEBUG nova.scheduler.client.report [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:05:40 compute-0 nova_compute[183177]: 2026-01-26 20:05:40.404 183181 DEBUG nova.compute.resource_tracker [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:05:40 compute-0 nova_compute[183177]: 2026-01-26 20:05:40.405 183181 DEBUG oslo_concurrency.lockutils [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.809s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:40 compute-0 nova_compute[183177]: 2026-01-26 20:05:40.428 183181 INFO nova.compute.manager [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:05:41 compute-0 nova_compute[183177]: 2026-01-26 20:05:41.515 183181 INFO nova.scheduler.client.report [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration f042b9a5-f35b-43b9-b8eb-bf10a0230eb4
Jan 26 20:05:41 compute-0 nova_compute[183177]: 2026-01-26 20:05:41.517 183181 DEBUG nova.virt.libvirt.driver [None req-5632270b-bbcf-47e4-b603-69cf5da2c88b 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 74afc852-d448-4b15-b808-f3949eeee83c] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.153 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.544 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.627 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.628 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.688 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.690 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Preparing to wait for external event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.691 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.692 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:42 compute-0 nova_compute[183177]: 2026-01-26 20:05:42.692 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:43 compute-0 nova_compute[183177]: 2026-01-26 20:05:43.141 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:44 compute-0 podman[214634]: 2026-01-26 20:05:44.316708544 +0000 UTC m=+0.057141439 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:05:47 compute-0 nova_compute[183177]: 2026-01-26 20:05:47.158 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.144 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.788 183181 DEBUG nova.compute.manager [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.788 183181 DEBUG oslo_concurrency.lockutils [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.788 183181 DEBUG oslo_concurrency.lockutils [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.789 183181 DEBUG oslo_concurrency.lockutils [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.789 183181 DEBUG nova.compute.manager [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No event matching network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 in dict_keys([('network-vif-plugged', '2220e11e-047f-472c-b3b5-fc824d195b31')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:05:48 compute-0 nova_compute[183177]: 2026-01-26 20:05:48.789 183181 DEBUG nova.compute.manager [req-606655f1-38ea-4ef6-a4f4-7f313a70050a req-494a7d78-2774-4aab-aad3-7ab8d021656f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.216 183181 INFO nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.891 183181 DEBUG nova.compute.manager [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.892 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.893 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.893 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.894 183181 DEBUG nova.compute.manager [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Processing event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.895 183181 DEBUG nova.compute.manager [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-changed-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.895 183181 DEBUG nova.compute.manager [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Refreshing instance network info cache due to event network-changed-2220e11e-047f-472c-b3b5-fc824d195b31. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.896 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.897 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.897 183181 DEBUG nova.network.neutron [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Refreshing network info cache for port 2220e11e-047f-472c-b3b5-fc824d195b31 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:05:50 compute-0 nova_compute[183177]: 2026-01-26 20:05:50.900 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.432 183181 WARNING neutronclient.v2_0.client [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.437 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9mrq8s7t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6adcbc8f-e643-4839-981f-0ccabe843c29',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ff46a03d-c1c5-4193-aa9b-527250ff32d2),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.951 183181 DEBUG nova.objects.instance [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 6adcbc8f-e643-4839-981f-0ccabe843c29 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.952 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.953 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:05:51 compute-0 nova_compute[183177]: 2026-01-26 20:05:51.954 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.190 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.456 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.457 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.468 183181 DEBUG nova.virt.libvirt.vif [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2107227',id=27,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:04:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-w07hfiyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:04:36Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=6adcbc8f-e643-4839-981f-0ccabe843c29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.468 183181 DEBUG nova.network.os_vif_util [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.469 183181 DEBUG nova.network.os_vif_util [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.470 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:ce:3d:9e"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <target dev="tap2220e11e-04"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]: </interface>
Jan 26 20:05:52 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.471 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <name>instance-0000001b</name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <uuid>6adcbc8f-e643-4839-981f-0ccabe843c29</uuid>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729</nova:name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:30</nova:creationTime>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:port uuid="2220e11e-047f-472c-b3b5-fc824d195b31">
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="serial">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="uuid">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:ce:3d:9e"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="tap2220e11e-04"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:52 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.471 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <name>instance-0000001b</name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <uuid>6adcbc8f-e643-4839-981f-0ccabe843c29</uuid>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729</nova:name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:30</nova:creationTime>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:port uuid="2220e11e-047f-472c-b3b5-fc824d195b31">
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="serial">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="uuid">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:ce:3d:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2220e11e-04"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:52 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.472 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <name>instance-0000001b</name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <uuid>6adcbc8f-e643-4839-981f-0ccabe843c29</uuid>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729</nova:name>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:04:30</nova:creationTime>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <nova:port uuid="2220e11e-047f-472c-b3b5-fc824d195b31">
Jan 26 20:05:52 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="serial">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="uuid">6adcbc8f-e643-4839-981f-0ccabe843c29</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </system>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </os>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </features>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/disk.config"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:ce:3d:9e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap2220e11e-04"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:05:52 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       </target>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29/console.log" append="off"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </console>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </input>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </video>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:05:52 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:05:52 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:05:52 compute-0 nova_compute[183177]: </domain>
Jan 26 20:05:52 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.472 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.536 183181 WARNING neutronclient.v2_0.client [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.723 183181 DEBUG nova.network.neutron [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updated VIF entry in instance network info cache for port 2220e11e-047f-472c-b3b5-fc824d195b31. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.724 183181 DEBUG nova.network.neutron [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Updating instance_info_cache with network_info: [{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:05:52 compute-0 sshd-session[214673]: Connection closed by authenticating user root 142.93.140.142 port 49750 [preauth]
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.958 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:05:52 compute-0 nova_compute[183177]: 2026-01-26 20:05:52.959 183181 INFO nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:05:53 compute-0 nova_compute[183177]: 2026-01-26 20:05:53.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:53 compute-0 nova_compute[183177]: 2026-01-26 20:05:53.231 183181 DEBUG oslo_concurrency.lockutils [req-3b78c852-682c-40f6-82a7-1b9940a1f7c9 req-084486ec-a8b3-4070-be78-973e0c23e6ce 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-6adcbc8f-e643-4839-981f-0ccabe843c29" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:05:53 compute-0 nova_compute[183177]: 2026-01-26 20:05:53.990 183181 INFO nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.494 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.494 183181 DEBUG nova.virt.libvirt.migration [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:05:54 compute-0 kernel: tap2220e11e-04 (unregistering): left promiscuous mode
Jan 26 20:05:54 compute-0 NetworkManager[55489]: <info>  [1769457954.5620] device (tap2220e11e-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:05:54 compute-0 ovn_controller[95396]: 2026-01-26T20:05:54Z|00211|binding|INFO|Releasing lport 2220e11e-047f-472c-b3b5-fc824d195b31 from this chassis (sb_readonly=0)
Jan 26 20:05:54 compute-0 ovn_controller[95396]: 2026-01-26T20:05:54Z|00212|binding|INFO|Setting lport 2220e11e-047f-472c-b3b5-fc824d195b31 down in Southbound
Jan 26 20:05:54 compute-0 ovn_controller[95396]: 2026-01-26T20:05:54Z|00213|binding|INFO|Removing iface tap2220e11e-04 ovn-installed in OVS
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.566 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.567 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.578 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:3d:9e 10.100.0.8'], port_security=['fa:16:3e:ce:3d:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6adcbc8f-e643-4839-981f-0ccabe843c29', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=2220e11e-047f-472c-b3b5-fc824d195b31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.579 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 2220e11e-047f-472c-b3b5-fc824d195b31 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c unbound from our chassis
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.581 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.583 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.583 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf583e8-9373-49d8-9ddc-0615d781f508]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.584 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c namespace which is not needed anymore
Jan 26 20:05:54 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 26 20:05:54 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Consumed 17.569s CPU time.
Jan 26 20:05:54 compute-0 systemd-machined[154465]: Machine qemu-20-instance-0000001b terminated.
Jan 26 20:05:54 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [NOTICE]   (214182) : haproxy version is 3.0.5-8e879a5
Jan 26 20:05:54 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [NOTICE]   (214182) : path to executable is /usr/sbin/haproxy
Jan 26 20:05:54 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [WARNING]  (214182) : Exiting Master process...
Jan 26 20:05:54 compute-0 podman[214703]: 2026-01-26 20:05:54.686272959 +0000 UTC m=+0.027544462 container kill 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:05:54 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [ALERT]    (214182) : Current worker (214184) exited with code 143 (Terminated)
Jan 26 20:05:54 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[214157]: [WARNING]  (214182) : All workers exited. Exiting... (0)
Jan 26 20:05:54 compute-0 systemd[1]: libpod-5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0.scope: Deactivated successfully.
Jan 26 20:05:54 compute-0 podman[214717]: 2026-01-26 20:05:54.722050412 +0000 UTC m=+0.020118762 container died 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 26 20:05:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0-userdata-shm.mount: Deactivated successfully.
Jan 26 20:05:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-64f6573c5eb3cb04ca1da582602a0215d076a1dace4febd82cfd79a6248d8c3b-merged.mount: Deactivated successfully.
Jan 26 20:05:54 compute-0 podman[214717]: 2026-01-26 20:05:54.769053827 +0000 UTC m=+0.067122137 container cleanup 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 26 20:05:54 compute-0 systemd[1]: libpod-conmon-5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0.scope: Deactivated successfully.
Jan 26 20:05:54 compute-0 podman[214724]: 2026-01-26 20:05:54.789283672 +0000 UTC m=+0.065977347 container remove 5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.794 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1232e9f4-edd4-49a1-8bab-f9b8c09b0198]: (4, ("Mon Jan 26 08:05:54 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c (5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0)\n5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0\nMon Jan 26 08:05:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c (5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0)\n5b0ca3e484094914284aa60077a16ccf80eaa2aa4691335a71d613bcd505a1a0\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.796 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b5afd417-84f0-467d-9556-2623a1cec6c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.796 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.797 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b29e4340-df4a-4299-a07a-110b629d42c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.797 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.799 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:54 compute-0 kernel: tapd9f030a0-20: left promiscuous mode
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.809 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.809 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.810 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.813 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.815 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[51c5e934-2b57-459a-99e1-9d901de2d00c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.825 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e376ba-b2b2-4fad-9ca6-130ea8edb032]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.826 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd891df-e554-4e7f-a67e-06736fb47bc9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.842 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[51b9a796-bd40-46da-9a20-8195074eed3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530953, 'reachable_time': 19439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214770, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.843 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:05:54 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:05:54.844 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[49ad5c00-6fc6-4337-8553-18d09ed940c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:05:54 compute-0 systemd[1]: run-netns-ovnmeta\x2dd9f030a0\x2d2e80\x2d4f5c\x2d97ab\x2deb7e0b1edd6c.mount: Deactivated successfully.
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.996 183181 DEBUG nova.virt.libvirt.guest [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '6adcbc8f-e643-4839-981f-0ccabe843c29' (instance-0000001b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.997 183181 INFO nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migration operation has completed
Jan 26 20:05:54 compute-0 nova_compute[183177]: 2026-01-26 20:05:54.997 183181 INFO nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] _post_live_migration() is started..
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.012 183181 WARNING neutronclient.v2_0.client [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.012 183181 WARNING neutronclient.v2_0.client [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.355 183181 DEBUG nova.compute.manager [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.355 183181 DEBUG oslo_concurrency.lockutils [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.355 183181 DEBUG oslo_concurrency.lockutils [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.355 183181 DEBUG oslo_concurrency.lockutils [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.356 183181 DEBUG nova.compute.manager [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:55 compute-0 nova_compute[183177]: 2026-01-26 20:05:55.356 183181 DEBUG nova.compute.manager [req-ea5287a7-2ae1-4686-8b24-39931bb57352 req-eec8138b-04db-4d96-8fc2-eafee580078f 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.401 183181 DEBUG nova.network.neutron [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 2220e11e-047f-472c-b3b5-fc824d195b31 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.402 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.404 183181 DEBUG nova.virt.libvirt.vif [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:04:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-2107227729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-2107227',id=27,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:04:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-w07hfiyx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:05:04Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=6adcbc8f-e643-4839-981f-0ccabe843c29,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.404 183181 DEBUG nova.network.os_vif_util [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "2220e11e-047f-472c-b3b5-fc824d195b31", "address": "fa:16:3e:ce:3d:9e", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2220e11e-04", "ovs_interfaceid": "2220e11e-047f-472c-b3b5-fc824d195b31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.405 183181 DEBUG nova.network.os_vif_util [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.406 183181 DEBUG os_vif [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.408 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.409 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2220e11e-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.411 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.413 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.414 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.414 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=67a698de-7be0-4b5a-9022-c275ded39d1d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.415 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.416 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.419 183181 INFO os_vif [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:3d:9e,bridge_name='br-int',has_traffic_filtering=True,id=2220e11e-047f-472c-b3b5-fc824d195b31,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2220e11e-04')
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.420 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.420 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.421 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.421 183181 DEBUG nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.422 183181 INFO nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Deleting instance files /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29_del
Jan 26 20:05:56 compute-0 nova_compute[183177]: 2026-01-26 20:05:56.423 183181 INFO nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Deletion of /var/lib/nova/instances/6adcbc8f-e643-4839-981f-0ccabe843c29_del complete
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.479 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.480 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.480 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.481 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.481 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.482 183181 WARNING nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received unexpected event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with vm_state active and task_state migrating.
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.482 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.483 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.484 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.484 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.485 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.485 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.485 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.486 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.486 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.486 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.487 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.487 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-unplugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.487 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.488 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.488 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.489 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.489 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.489 183181 WARNING nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received unexpected event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with vm_state active and task_state migrating.
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.490 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.490 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.490 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.491 183181 DEBUG oslo_concurrency.lockutils [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.491 183181 DEBUG nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] No waiting events found dispatching network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:05:57 compute-0 nova_compute[183177]: 2026-01-26 20:05:57.491 183181 WARNING nova.compute.manager [req-dc4c020e-0ab4-4799-abdf-0bcb04748b69 req-e41efc20-e955-4cda-9822-e7bf6e083b83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Received unexpected event network-vif-plugged-2220e11e-047f-472c-b3b5-fc824d195b31 for instance with vm_state active and task_state migrating.
Jan 26 20:05:58 compute-0 nova_compute[183177]: 2026-01-26 20:05:58.148 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:05:58 compute-0 sshd-session[214771]: Connection closed by 101.126.147.62 port 62234
Jan 26 20:05:59 compute-0 podman[192499]: time="2026-01-26T20:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:05:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:05:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 26 20:06:01 compute-0 openstack_network_exporter[195363]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:06:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:06:01 compute-0 openstack_network_exporter[195363]: ERROR   20:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:06:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:06:01 compute-0 nova_compute[183177]: 2026-01-26 20:06:01.418 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:02 compute-0 nova_compute[183177]: 2026-01-26 20:06:02.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:03 compute-0 nova_compute[183177]: 2026-01-26 20:06:03.150 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.420 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.475 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.476 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.476 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "6adcbc8f-e643-4839-981f-0ccabe843c29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.991 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.991 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.992 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:06 compute-0 nova_compute[183177]: 2026-01-26 20:06:06.992 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.186 183181 WARNING nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.187 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.215 183181 DEBUG oslo_concurrency.processutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.216 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.09809875488281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.217 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:07 compute-0 nova_compute[183177]: 2026-01-26 20:06:07.217 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.198 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:08 compute-0 podman[214777]: 2026-01-26 20:06:08.349025758 +0000 UTC m=+0.083067557 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.354 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 6adcbc8f-e643-4839-981f-0ccabe843c29 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:06:08 compute-0 podman[214778]: 2026-01-26 20:06:08.366113649 +0000 UTC m=+0.092189873 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 20:06:08 compute-0 podman[214776]: 2026-01-26 20:06:08.39998588 +0000 UTC m=+0.134643085 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.862 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.900 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration ff46a03d-c1c5-4193-aa9b-527250ff32d2 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.901 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.902 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:06:07 up  1:30,  0 user,  load average: 0.35, 0.30, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:06:08 compute-0 nova_compute[183177]: 2026-01-26 20:06:08.953 183181 DEBUG nova.compute.provider_tree [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:06:09 compute-0 nova_compute[183177]: 2026-01-26 20:06:09.461 183181 DEBUG nova.scheduler.client.report [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:06:09 compute-0 nova_compute[183177]: 2026-01-26 20:06:09.986 183181 DEBUG nova.compute.resource_tracker [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:06:09 compute-0 nova_compute[183177]: 2026-01-26 20:06:09.986 183181 DEBUG oslo_concurrency.lockutils [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.769s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:10 compute-0 nova_compute[183177]: 2026-01-26 20:06:10.013 183181 INFO nova.compute.manager [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:06:11 compute-0 nova_compute[183177]: 2026-01-26 20:06:11.120 183181 INFO nova.scheduler.client.report [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration ff46a03d-c1c5-4193-aa9b-527250ff32d2
Jan 26 20:06:11 compute-0 nova_compute[183177]: 2026-01-26 20:06:11.120 183181 DEBUG nova.virt.libvirt.driver [None req-f33e3d2b-acd5-4c1c-87e2-850c54dd5600 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 6adcbc8f-e643-4839-981f-0ccabe843c29] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:06:11 compute-0 nova_compute[183177]: 2026-01-26 20:06:11.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:11 compute-0 nova_compute[183177]: 2026-01-26 20:06:11.423 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.219 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.664 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.664 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.665 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.665 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.855 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.856 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.876 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.877 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.09809875488281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.878 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:13 compute-0 nova_compute[183177]: 2026-01-26 20:06:13.879 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:14 compute-0 nova_compute[183177]: 2026-01-26 20:06:14.919 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:06:14 compute-0 nova_compute[183177]: 2026-01-26 20:06:14.919 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:06:13 up  1:30,  0 user,  load average: 0.32, 0.30, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:06:14 compute-0 nova_compute[183177]: 2026-01-26 20:06:14.945 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:06:15 compute-0 podman[214837]: 2026-01-26 20:06:15.306727944 +0000 UTC m=+0.056785100 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:06:15 compute-0 nova_compute[183177]: 2026-01-26 20:06:15.452 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:06:15 compute-0 nova_compute[183177]: 2026-01-26 20:06:15.965 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:06:15 compute-0 nova_compute[183177]: 2026-01-26 20:06:15.966 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:15 compute-0 sshd-session[214861]: Connection closed by authenticating user root 188.166.116.149 port 44244 [preauth]
Jan 26 20:06:16 compute-0 nova_compute[183177]: 2026-01-26 20:06:16.426 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:16 compute-0 nova_compute[183177]: 2026-01-26 20:06:16.967 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:16 compute-0 nova_compute[183177]: 2026-01-26 20:06:16.968 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:06:17 compute-0 nova_compute[183177]: 2026-01-26 20:06:17.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:18 compute-0 nova_compute[183177]: 2026-01-26 20:06:18.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:18 compute-0 nova_compute[183177]: 2026-01-26 20:06:18.224 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:20 compute-0 nova_compute[183177]: 2026-01-26 20:06:20.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:21 compute-0 nova_compute[183177]: 2026-01-26 20:06:21.429 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:23 compute-0 nova_compute[183177]: 2026-01-26 20:06:23.269 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:24.097 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:24.098 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:24.098 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:24 compute-0 nova_compute[183177]: 2026-01-26 20:06:24.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:06:26 compute-0 nova_compute[183177]: 2026-01-26 20:06:26.431 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:28 compute-0 nova_compute[183177]: 2026-01-26 20:06:28.302 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:29 compute-0 podman[192499]: time="2026-01-26T20:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:06:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:06:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 26 20:06:31 compute-0 nova_compute[183177]: 2026-01-26 20:06:31.432 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:31 compute-0 openstack_network_exporter[195363]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:06:31 compute-0 openstack_network_exporter[195363]: ERROR   20:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:06:32 compute-0 sshd-session[214864]: Connection closed by authenticating user root 142.93.140.142 port 48538 [preauth]
Jan 26 20:06:33 compute-0 nova_compute[183177]: 2026-01-26 20:06:33.303 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:33 compute-0 nova_compute[183177]: 2026-01-26 20:06:33.882 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:33 compute-0 nova_compute[183177]: 2026-01-26 20:06:33.882 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:34 compute-0 nova_compute[183177]: 2026-01-26 20:06:34.388 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:06:34 compute-0 nova_compute[183177]: 2026-01-26 20:06:34.951 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:34 compute-0 nova_compute[183177]: 2026-01-26 20:06:34.952 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:34 compute-0 nova_compute[183177]: 2026-01-26 20:06:34.962 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:06:34 compute-0 nova_compute[183177]: 2026-01-26 20:06:34.963 183181 INFO nova.compute.claims [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:06:36 compute-0 nova_compute[183177]: 2026-01-26 20:06:36.039 183181 DEBUG nova.compute.provider_tree [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:06:36 compute-0 nova_compute[183177]: 2026-01-26 20:06:36.435 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:36 compute-0 nova_compute[183177]: 2026-01-26 20:06:36.547 183181 DEBUG nova.scheduler.client.report [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.060 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.060 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.573 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.574 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.575 183181 WARNING neutronclient.v2_0.client [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:37 compute-0 nova_compute[183177]: 2026-01-26 20:06:37.575 183181 WARNING neutronclient.v2_0.client [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:38 compute-0 nova_compute[183177]: 2026-01-26 20:06:38.084 183181 INFO nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:06:38 compute-0 nova_compute[183177]: 2026-01-26 20:06:38.345 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:38 compute-0 nova_compute[183177]: 2026-01-26 20:06:38.596 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:06:39 compute-0 podman[214868]: 2026-01-26 20:06:39.39256537 +0000 UTC m=+0.123058194 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 26 20:06:39 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:39.405 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:06:39 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:39.406 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:06:39 compute-0 podman[214867]: 2026-01-26 20:06:39.40668099 +0000 UTC m=+0.139753153 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.425 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:39 compute-0 podman[214866]: 2026-01-26 20:06:39.451318322 +0000 UTC m=+0.192028450 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.588 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Successfully created port: bf218032-dd19-417b-ad93-d29e2b451fce _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.619 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.621 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.622 183181 INFO nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Creating image(s)
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.623 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.623 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.624 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.625 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.631 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.633 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.723 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.724 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.725 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.726 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.730 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.731 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.796 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.797 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.841 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.843 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.843 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.902 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.903 183181 DEBUG nova.virt.disk.api [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Checking if we can resize image /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.904 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.962 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.964 183181 DEBUG nova.virt.disk.api [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Cannot resize image /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.965 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.965 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Ensure instance console log exists: /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.966 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.967 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:39 compute-0 nova_compute[183177]: 2026-01-26 20:06:39.967 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.658 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Successfully updated port: bf218032-dd19-417b-ad93-d29e2b451fce _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.734 183181 DEBUG nova.compute.manager [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-changed-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.735 183181 DEBUG nova.compute.manager [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Refreshing instance network info cache due to event network-changed-bf218032-dd19-417b-ad93-d29e2b451fce. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.735 183181 DEBUG oslo_concurrency.lockutils [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.736 183181 DEBUG oslo_concurrency.lockutils [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:06:40 compute-0 nova_compute[183177]: 2026-01-26 20:06:40.736 183181 DEBUG nova.network.neutron [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Refreshing network info cache for port bf218032-dd19-417b-ad93-d29e2b451fce _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:06:41 compute-0 nova_compute[183177]: 2026-01-26 20:06:41.167 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:06:41 compute-0 nova_compute[183177]: 2026-01-26 20:06:41.242 183181 WARNING neutronclient.v2_0.client [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:41 compute-0 nova_compute[183177]: 2026-01-26 20:06:41.406 183181 DEBUG nova.network.neutron [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:06:41 compute-0 nova_compute[183177]: 2026-01-26 20:06:41.437 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:41 compute-0 nova_compute[183177]: 2026-01-26 20:06:41.830 183181 DEBUG nova.network.neutron [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:06:42 compute-0 nova_compute[183177]: 2026-01-26 20:06:42.338 183181 DEBUG oslo_concurrency.lockutils [req-f4dc4d51-6466-422e-a401-571b6a36afc2 req-0a7e902e-4d61-4ae5-b206-fd396ec6d223 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:06:42 compute-0 nova_compute[183177]: 2026-01-26 20:06:42.339 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquired lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:06:42 compute-0 nova_compute[183177]: 2026-01-26 20:06:42.339 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:06:43 compute-0 nova_compute[183177]: 2026-01-26 20:06:43.364 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:06:43 compute-0 nova_compute[183177]: 2026-01-26 20:06:43.378 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:43 compute-0 nova_compute[183177]: 2026-01-26 20:06:43.740 183181 WARNING neutronclient.v2_0.client [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.365 183181 DEBUG nova.network.neutron [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating instance_info_cache with network_info: [{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.873 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Releasing lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.873 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance network_info: |[{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.877 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Start _get_guest_xml network_info=[{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.882 183181 WARNING nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.885 183181 DEBUG nova.virt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786', uuid='a29d6012-19fb-48c7-864f-28ed4c715d89'), owner=OwnerMeta(userid='b3d5258d30ef4be39230c019f11bed8f', username='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin', projectid='8d30bc5631f24a6799364d53cb4e9465', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", 
"ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769458004.8852847) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.892 183181 DEBUG nova.virt.libvirt.host [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.893 183181 DEBUG nova.virt.libvirt.host [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.897 183181 DEBUG nova.virt.libvirt.host [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.898 183181 DEBUG nova.virt.libvirt.host [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.900 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.900 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.901 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.901 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.902 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.902 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.903 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.903 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.904 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.904 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.905 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.905 183181 DEBUG nova.virt.hardware [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.913 183181 DEBUG nova.virt.libvirt.vif [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:06:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-4899717',id=28,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-45j0gckj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:06:38Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=a29d6012-19fb-48c7-864f-28ed4c715d89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.913 183181 DEBUG nova.network.os_vif_util [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.915 183181 DEBUG nova.network.os_vif_util [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:06:44 compute-0 nova_compute[183177]: 2026-01-26 20:06:44.916 183181 DEBUG nova.objects.instance [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lazy-loading 'pci_devices' on Instance uuid a29d6012-19fb-48c7-864f-28ed4c715d89 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.425 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <uuid>a29d6012-19fb-48c7-864f-28ed4c715d89</uuid>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <name>instance-0000001c</name>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786</nova:name>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:06:44</nova:creationTime>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:06:45 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:06:45 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         <nova:port uuid="bf218032-dd19-417b-ad93-d29e2b451fce">
Jan 26 20:06:45 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <system>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="serial">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="uuid">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </system>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <os>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </os>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <features>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </features>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:eb:80:11"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <target dev="tapbf218032-dd"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <video>
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </video>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:06:45 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:06:45 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:06:45 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:06:45 compute-0 nova_compute[183177]: </domain>
Jan 26 20:06:45 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.427 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Preparing to wait for external event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.428 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.428 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.428 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.429 183181 DEBUG nova.virt.libvirt.vif [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:06:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-4899717',id=28,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-45j0gckj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:06:38Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=a29d6012-19fb-48c7-864f-28ed4c715d89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.429 183181 DEBUG nova.network.os_vif_util [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.430 183181 DEBUG nova.network.os_vif_util [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.430 183181 DEBUG os_vif [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.431 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.431 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.432 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.432 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.433 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd1549ba8-20f8-5e4c-b7a9-4b27dbb871aa', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.434 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.439 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf218032-dd, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.439 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbf218032-dd, col_values=(('qos', UUID('4f266747-330a-4177-981a-03c3cf44ea0c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.440 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbf218032-dd, col_values=(('external_ids', {'iface-id': 'bf218032-dd19-417b-ad93-d29e2b451fce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:80:11', 'vm-uuid': 'a29d6012-19fb-48c7-864f-28ed4c715d89'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.441 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 NetworkManager[55489]: <info>  [1769458005.4425] manager: (tapbf218032-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.443 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.450 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:45 compute-0 nova_compute[183177]: 2026-01-26 20:06:45.451 183181 INFO os_vif [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd')
Jan 26 20:06:46 compute-0 podman[214949]: 2026-01-26 20:06:46.345063955 +0000 UTC m=+0.083393047 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:06:47 compute-0 nova_compute[183177]: 2026-01-26 20:06:47.117 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:06:47 compute-0 nova_compute[183177]: 2026-01-26 20:06:47.117 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:06:47 compute-0 nova_compute[183177]: 2026-01-26 20:06:47.118 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No VIF found with MAC fa:16:3e:eb:80:11, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:06:47 compute-0 nova_compute[183177]: 2026-01-26 20:06:47.119 183181 INFO nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Using config drive
Jan 26 20:06:47 compute-0 nova_compute[183177]: 2026-01-26 20:06:47.787 183181 WARNING neutronclient.v2_0.client [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.383 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.407 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.508 183181 INFO nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Creating config drive at /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.521 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpgqs58tin execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.667 183181 DEBUG oslo_concurrency.processutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpgqs58tin" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:06:48 compute-0 kernel: tapbf218032-dd: entered promiscuous mode
Jan 26 20:06:48 compute-0 NetworkManager[55489]: <info>  [1769458008.7535] manager: (tapbf218032-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Jan 26 20:06:48 compute-0 ovn_controller[95396]: 2026-01-26T20:06:48Z|00214|binding|INFO|Claiming lport bf218032-dd19-417b-ad93-d29e2b451fce for this chassis.
Jan 26 20:06:48 compute-0 ovn_controller[95396]: 2026-01-26T20:06:48Z|00215|binding|INFO|bf218032-dd19-417b-ad93-d29e2b451fce: Claiming fa:16:3e:eb:80:11 10.100.0.10
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.756 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.769 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:80:11 10.100.0.10'], port_security=['fa:16:3e:eb:80:11 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a29d6012-19fb-48c7-864f-28ed4c715d89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=bf218032-dd19-417b-ad93-d29e2b451fce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:06:48 compute-0 ovn_controller[95396]: 2026-01-26T20:06:48Z|00216|binding|INFO|Setting lport bf218032-dd19-417b-ad93-d29e2b451fce ovn-installed in OVS
Jan 26 20:06:48 compute-0 ovn_controller[95396]: 2026-01-26T20:06:48Z|00217|binding|INFO|Setting lport bf218032-dd19-417b-ad93-d29e2b451fce up in Southbound
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.771 104672 INFO neutron.agent.ovn.metadata.agent [-] Port bf218032-dd19-417b-ad93-d29e2b451fce in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c bound to our chassis
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.772 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.773 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:06:48 compute-0 nova_compute[183177]: 2026-01-26 20:06:48.791 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.806 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e80c5006-ae42-4fd0-8f32-04e929264334]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.807 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9f030a0-21 in ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.810 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9f030a0-20 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.810 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f637b5-271c-459d-a118-747a483da7af]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.811 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[168a6810-829f-4955-a80d-b126cc56d5c5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 systemd-udevd[214993]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:06:48 compute-0 systemd-machined[154465]: New machine qemu-21-instance-0000001c.
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.825 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[c85856e3-2a6d-4620-b62e-760edd234f83]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001c.
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.847 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8dbc66a-1ac5-4d6e-ae8d-be32dde21b64]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 NetworkManager[55489]: <info>  [1769458008.8499] device (tapbf218032-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:06:48 compute-0 NetworkManager[55489]: <info>  [1769458008.8515] device (tapbf218032-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.896 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[98b45fec-0aea-496c-9000-f719441a6158]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 systemd-udevd[214996]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:06:48 compute-0 NetworkManager[55489]: <info>  [1769458008.9051] manager: (tapd9f030a0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.904 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a2cc84-079f-4fdc-b4c3-84cc67695ab9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.954 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[88b196c7-bd8f-4840-9f23-7d003ae53850]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:48 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:48.958 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[a034f5ac-03ed-4e85-b994-aad59cc0da29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 NetworkManager[55489]: <info>  [1769458009.0025] device (tapd9f030a0-20): carrier: link connected
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.013 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[8148ce74-f5ee-47f9-91c8-bbc21d22bfeb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.032 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[08660f38-e9c4-4b38-aba5-e1e81f72b497]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 38542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215024, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.060 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9279e5-ed45-4966-8e1d-c5faba88a5a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:c2e0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546788, 'tstamp': 546788}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215027, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.085 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9375baca-99c0-4258-8dd1-806a368082c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 38542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215032, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.137 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6f53e6fa-eafe-4210-987d-2e7617f5fb9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.224 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5d8c80-0918-4d70-a151-9ce6a2c1841d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.225 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.225 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.225 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:49 compute-0 NetworkManager[55489]: <info>  [1769458009.2683] manager: (tapd9f030a0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.268 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:49 compute-0 kernel: tapd9f030a0-20: entered promiscuous mode
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.270 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.272 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.273 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:49 compute-0 ovn_controller[95396]: 2026-01-26T20:06:49Z|00218|binding|INFO|Releasing lport 63338c40-169b-4962-a6c8-8ca20b375080 from this chassis (sb_readonly=0)
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.300 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.302 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[deb2eaeb-2dd0-48a3-a7de-5dedde65d6c6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.303 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.303 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.303 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.304 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.304 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb4be01-8a61-4eff-bc4c-33bc7d5375f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.305 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.305 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbf2368-5205-40fa-9300-0006d8c4383a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.306 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:06:49 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:06:49.306 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'env', 'PROCESS_TAG=haproxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.568 183181 DEBUG nova.compute.manager [req-4510a076-fec3-4d52-a5ee-fdae692a855a req-5e218b8a-b6c0-47b2-af75-bd595dda0930 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.569 183181 DEBUG oslo_concurrency.lockutils [req-4510a076-fec3-4d52-a5ee-fdae692a855a req-5e218b8a-b6c0-47b2-af75-bd595dda0930 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.570 183181 DEBUG oslo_concurrency.lockutils [req-4510a076-fec3-4d52-a5ee-fdae692a855a req-5e218b8a-b6c0-47b2-af75-bd595dda0930 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.570 183181 DEBUG oslo_concurrency.lockutils [req-4510a076-fec3-4d52-a5ee-fdae692a855a req-5e218b8a-b6c0-47b2-af75-bd595dda0930 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.570 183181 DEBUG nova.compute.manager [req-4510a076-fec3-4d52-a5ee-fdae692a855a req-5e218b8a-b6c0-47b2-af75-bd595dda0930 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Processing event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.571 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.577 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.581 183181 INFO nova.virt.libvirt.driver [-] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance spawned successfully.
Jan 26 20:06:49 compute-0 nova_compute[183177]: 2026-01-26 20:06:49.581 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:06:49 compute-0 podman[215065]: 2026-01-26 20:06:49.70807503 +0000 UTC m=+0.055185557 container create 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:06:49 compute-0 systemd[1]: Started libpod-conmon-429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80.scope.
Jan 26 20:06:49 compute-0 podman[215065]: 2026-01-26 20:06:49.680430485 +0000 UTC m=+0.027541022 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:06:49 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:06:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57606e01d76059b42d341c615673e01380e28eb8ad62b61df75b9137dd393735/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:06:49 compute-0 podman[215065]: 2026-01-26 20:06:49.801650508 +0000 UTC m=+0.148761045 container init 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 20:06:49 compute-0 podman[215065]: 2026-01-26 20:06:49.814595247 +0000 UTC m=+0.161705764 container start 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 20:06:49 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [NOTICE]   (215084) : New worker (215086) forked
Jan 26 20:06:49 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [NOTICE]   (215084) : Loading success.
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.097 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.098 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.098 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.099 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.099 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.099 183181 DEBUG nova.virt.libvirt.driver [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.499 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.607 183181 INFO nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Took 10.99 seconds to spawn the instance on the hypervisor.
Jan 26 20:06:50 compute-0 nova_compute[183177]: 2026-01-26 20:06:50.608 183181 DEBUG nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.151 183181 INFO nova.compute.manager [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Took 16.25 seconds to build instance.
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.661 183181 DEBUG oslo_concurrency.lockutils [None req-c325d962-36fb-42db-ac35-ec49e5fd2570 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.779s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.664 183181 DEBUG nova.compute.manager [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.665 183181 DEBUG oslo_concurrency.lockutils [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.665 183181 DEBUG oslo_concurrency.lockutils [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.665 183181 DEBUG oslo_concurrency.lockutils [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.665 183181 DEBUG nova.compute.manager [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:06:51 compute-0 nova_compute[183177]: 2026-01-26 20:06:51.666 183181 WARNING nova.compute.manager [req-317b25e5-a946-4a2d-b0c8-20df7049b8ce req-059f4ebe-2ade-47f6-9fb8-45d3588af625 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received unexpected event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with vm_state active and task_state None.
Jan 26 20:06:53 compute-0 nova_compute[183177]: 2026-01-26 20:06:53.384 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:55 compute-0 nova_compute[183177]: 2026-01-26 20:06:55.025 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:55 compute-0 nova_compute[183177]: 2026-01-26 20:06:55.026 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:55 compute-0 sshd-session[215096]: Connection closed by authenticating user root 188.166.116.149 port 52226 [preauth]
Jan 26 20:06:55 compute-0 nova_compute[183177]: 2026-01-26 20:06:55.502 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:55 compute-0 nova_compute[183177]: 2026-01-26 20:06:55.532 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:06:56 compute-0 nova_compute[183177]: 2026-01-26 20:06:56.094 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:06:56 compute-0 nova_compute[183177]: 2026-01-26 20:06:56.095 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:06:56 compute-0 nova_compute[183177]: 2026-01-26 20:06:56.102 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:06:56 compute-0 nova_compute[183177]: 2026-01-26 20:06:56.103 183181 INFO nova.compute.claims [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:06:57 compute-0 nova_compute[183177]: 2026-01-26 20:06:57.203 183181 DEBUG nova.compute.provider_tree [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:06:57 compute-0 nova_compute[183177]: 2026-01-26 20:06:57.713 183181 DEBUG nova.scheduler.client.report [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.225 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.226 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.386 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.741 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.742 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.743 183181 WARNING neutronclient.v2_0.client [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:58 compute-0 nova_compute[183177]: 2026-01-26 20:06:58.744 183181 WARNING neutronclient.v2_0.client [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:06:59 compute-0 nova_compute[183177]: 2026-01-26 20:06:59.252 183181 INFO nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:06:59 compute-0 nova_compute[183177]: 2026-01-26 20:06:59.292 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Successfully created port: 9fc4022a-3a46-4857-8c00-814af094ef10 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:06:59 compute-0 podman[192499]: time="2026-01-26T20:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:06:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:06:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2646 "" "Go-http-client/1.1"
Jan 26 20:06:59 compute-0 nova_compute[183177]: 2026-01-26 20:06:59.762 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.550 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.784 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.786 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.787 183181 INFO nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Creating image(s)
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.788 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.788 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.789 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.790 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.797 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.799 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.890 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.892 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.893 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.894 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.898 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.899 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.961 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.962 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.996 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.997 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:00 compute-0 nova_compute[183177]: 2026-01-26 20:07:00.997 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.048 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.050 183181 DEBUG nova.virt.disk.api [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Checking if we can resize image /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.050 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.101 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.103 183181 DEBUG nova.virt.disk.api [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Cannot resize image /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.104 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.104 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Ensure instance console log exists: /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.105 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.106 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:01 compute-0 nova_compute[183177]: 2026-01-26 20:07:01.106 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:01 compute-0 openstack_network_exporter[195363]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:07:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:07:01 compute-0 openstack_network_exporter[195363]: ERROR   20:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:07:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:07:02 compute-0 ovn_controller[95396]: 2026-01-26T20:07:02Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:80:11 10.100.0.10
Jan 26 20:07:02 compute-0 ovn_controller[95396]: 2026-01-26T20:07:02Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:80:11 10.100.0.10
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.313 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Successfully updated port: 9fc4022a-3a46-4857-8c00-814af094ef10 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.378 183181 DEBUG nova.compute.manager [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-changed-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.379 183181 DEBUG nova.compute.manager [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Refreshing instance network info cache due to event network-changed-9fc4022a-3a46-4857-8c00-814af094ef10. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.380 183181 DEBUG oslo_concurrency.lockutils [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.380 183181 DEBUG oslo_concurrency.lockutils [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.380 183181 DEBUG nova.network.neutron [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Refreshing network info cache for port 9fc4022a-3a46-4857-8c00-814af094ef10 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.819 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:07:02 compute-0 nova_compute[183177]: 2026-01-26 20:07:02.887 183181 WARNING neutronclient.v2_0.client [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:07:03 compute-0 nova_compute[183177]: 2026-01-26 20:07:03.423 183181 DEBUG nova.network.neutron [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:07:03 compute-0 nova_compute[183177]: 2026-01-26 20:07:03.427 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:03 compute-0 nova_compute[183177]: 2026-01-26 20:07:03.613 183181 DEBUG nova.network.neutron [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:07:04 compute-0 nova_compute[183177]: 2026-01-26 20:07:04.123 183181 DEBUG oslo_concurrency.lockutils [req-e1cbc899-6e5e-4f79-8dd0-963884138a7e req-e123f997-b4b0-4ede-adf8-b13a94f68542 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:07:04 compute-0 nova_compute[183177]: 2026-01-26 20:07:04.124 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquired lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:07:04 compute-0 nova_compute[183177]: 2026-01-26 20:07:04.124 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:07:05 compute-0 nova_compute[183177]: 2026-01-26 20:07:05.408 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:07:05 compute-0 nova_compute[183177]: 2026-01-26 20:07:05.611 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:05 compute-0 nova_compute[183177]: 2026-01-26 20:07:05.696 183181 WARNING neutronclient.v2_0.client [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:07:06 compute-0 nova_compute[183177]: 2026-01-26 20:07:06.524 183181 DEBUG nova.network.neutron [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Updating instance_info_cache with network_info: [{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.031 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Releasing lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.032 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance network_info: |[{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.036 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Start _get_guest_xml network_info=[{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.042 183181 WARNING nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.043 183181 DEBUG nova.virt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564', uuid='33cfa3c1-e4a7-4cc7-98f9-c638d97608f5'), owner=OwnerMeta(userid='b3d5258d30ef4be39230c019f11bed8f', username='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin', projectid='8d30bc5631f24a6799364d53cb4e9465', projectname='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", 
"ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769458027.0438485) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.051 183181 DEBUG nova.virt.libvirt.host [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.051 183181 DEBUG nova.virt.libvirt.host [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.054 183181 DEBUG nova.virt.libvirt.host [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.055 183181 DEBUG nova.virt.libvirt.host [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.056 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.056 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.056 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.056 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.057 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.058 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.058 183181 DEBUG nova.virt.hardware [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.061 183181 DEBUG nova.virt.libvirt.vif [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-7814135',id=29,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-lon7s37m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_n
ame='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:06:59Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=33cfa3c1-e4a7-4cc7-98f9-c638d97608f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.061 183181 DEBUG nova.network.os_vif_util [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.062 183181 DEBUG nova.network.os_vif_util [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.062 183181 DEBUG nova.objects.instance [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.571 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <uuid>33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</uuid>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <name>instance-0000001d</name>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564</nova:name>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:07:07</nova:creationTime>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:07:07 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:07:07 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         <nova:port uuid="9fc4022a-3a46-4857-8c00-814af094ef10">
Jan 26 20:07:07 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <system>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="serial">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="uuid">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </system>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <os>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </os>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <features>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </features>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:03:84:2b"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <target dev="tap9fc4022a-3a"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <video>
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </video>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:07:07 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:07:07 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:07:07 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:07:07 compute-0 nova_compute[183177]: </domain>
Jan 26 20:07:07 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.572 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Preparing to wait for external event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.572 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.573 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.573 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.573 183181 DEBUG nova.virt.libvirt.vif [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-7814135',id=29,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-lon7s37m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',ow
ner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:06:59Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=33cfa3c1-e4a7-4cc7-98f9-c638d97608f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.574 183181 DEBUG nova.network.os_vif_util [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converting VIF {"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.574 183181 DEBUG nova.network.os_vif_util [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.574 183181 DEBUG os_vif [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.575 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.575 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.575 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.576 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.576 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '28dfac9a-2dcb-5b6d-a402-e61f11f180f5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.577 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.579 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.582 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.582 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fc4022a-3a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.583 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9fc4022a-3a, col_values=(('qos', UUID('12b910fc-1b46-4f38-bfc1-a0f02ebd1eae')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.583 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9fc4022a-3a, col_values=(('external_ids', {'iface-id': '9fc4022a-3a46-4857-8c00-814af094ef10', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:84:2b', 'vm-uuid': '33cfa3c1-e4a7-4cc7-98f9-c638d97608f5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.585 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 NetworkManager[55489]: <info>  [1769458027.5866] manager: (tap9fc4022a-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.588 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:07 compute-0 nova_compute[183177]: 2026-01-26 20:07:07.596 183181 INFO os_vif [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a')
Jan 26 20:07:08 compute-0 nova_compute[183177]: 2026-01-26 20:07:08.427 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.160 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.160 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.161 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] No VIF found with MAC fa:16:3e:03:84:2b, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.161 183181 INFO nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Using config drive
Jan 26 20:07:09 compute-0 nova_compute[183177]: 2026-01-26 20:07:09.678 183181 WARNING neutronclient.v2_0.client [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.316 183181 INFO nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Creating config drive at /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.329 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpy709_05v execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:10 compute-0 podman[215126]: 2026-01-26 20:07:10.361941004 +0000 UTC m=+0.098510794 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 20:07:10 compute-0 podman[215127]: 2026-01-26 20:07:10.368263843 +0000 UTC m=+0.096670123 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 26 20:07:10 compute-0 podman[215125]: 2026-01-26 20:07:10.404404786 +0000 UTC m=+0.143550464 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260120, io.buildah.version=1.41.4)
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.466 183181 DEBUG oslo_concurrency.processutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpy709_05v" returned: 0 in 0.137s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:10 compute-0 kernel: tap9fc4022a-3a: entered promiscuous mode
Jan 26 20:07:10 compute-0 NetworkManager[55489]: <info>  [1769458030.5311] manager: (tap9fc4022a-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Jan 26 20:07:10 compute-0 ovn_controller[95396]: 2026-01-26T20:07:10Z|00219|binding|INFO|Claiming lport 9fc4022a-3a46-4857-8c00-814af094ef10 for this chassis.
Jan 26 20:07:10 compute-0 ovn_controller[95396]: 2026-01-26T20:07:10Z|00220|binding|INFO|9fc4022a-3a46-4857-8c00-814af094ef10: Claiming fa:16:3e:03:84:2b 10.100.0.7
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.557 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.567 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:84:2b 10.100.0.7'], port_security=['fa:16:3e:03:84:2b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33cfa3c1-e4a7-4cc7-98f9-c638d97608f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=9fc4022a-3a46-4857-8c00-814af094ef10) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.568 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc4022a-3a46-4857-8c00-814af094ef10 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c bound to our chassis
Jan 26 20:07:10 compute-0 systemd-udevd[215203]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.569 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:07:10 compute-0 ovn_controller[95396]: 2026-01-26T20:07:10Z|00221|binding|INFO|Setting lport 9fc4022a-3a46-4857-8c00-814af094ef10 ovn-installed in OVS
Jan 26 20:07:10 compute-0 ovn_controller[95396]: 2026-01-26T20:07:10Z|00222|binding|INFO|Setting lport 9fc4022a-3a46-4857-8c00-814af094ef10 up in Southbound
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.572 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:10 compute-0 NetworkManager[55489]: <info>  [1769458030.5863] device (tap9fc4022a-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:07:10 compute-0 NetworkManager[55489]: <info>  [1769458030.5870] device (tap9fc4022a-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.589 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[31ac8bb1-dcc3-45bb-ac4e-0068d1fcf42e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 systemd-machined[154465]: New machine qemu-22-instance-0000001d.
Jan 26 20:07:10 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.615 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a65f04-b0ab-46f9-b826-11ae275341e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.618 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[105fe9af-8cbd-4338-b10c-e7cb43353969]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.638 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffcba2c-3d26-4ac0-a573-8c12f0b2650d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.662 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac5936e-2b59-47f9-b0ea-063167d68a1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 38542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215217, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.679 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2e721871-30eb-4d67-a6dc-9adb53fdcd5b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546805, 'tstamp': 546805}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215219, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546810, 'tstamp': 546810}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215219, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.680 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.682 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:10 compute-0 nova_compute[183177]: 2026-01-26 20:07:10.682 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.683 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.683 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.683 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.683 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:07:10 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:10.684 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[07306bda-a30f-4670-b7dd-8709df929e38]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:07:11 compute-0 sshd-session[215221]: Connection closed by authenticating user root 142.93.140.142 port 44774 [preauth]
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.483 183181 DEBUG nova.compute.manager [req-1a3d9c64-3422-43ab-be02-df5a1f93ca6d req-7148a9db-e635-46c4-8a2b-992461d0fa03 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.483 183181 DEBUG oslo_concurrency.lockutils [req-1a3d9c64-3422-43ab-be02-df5a1f93ca6d req-7148a9db-e635-46c4-8a2b-992461d0fa03 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.484 183181 DEBUG oslo_concurrency.lockutils [req-1a3d9c64-3422-43ab-be02-df5a1f93ca6d req-7148a9db-e635-46c4-8a2b-992461d0fa03 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.484 183181 DEBUG oslo_concurrency.lockutils [req-1a3d9c64-3422-43ab-be02-df5a1f93ca6d req-7148a9db-e635-46c4-8a2b-992461d0fa03 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.484 183181 DEBUG nova.compute.manager [req-1a3d9c64-3422-43ab-be02-df5a1f93ca6d req-7148a9db-e635-46c4-8a2b-992461d0fa03 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Processing event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.486 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.490 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.494 183181 INFO nova.virt.libvirt.driver [-] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance spawned successfully.
Jan 26 20:07:11 compute-0 nova_compute[183177]: 2026-01-26 20:07:11.495 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.009 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.010 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.010 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.011 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.011 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.012 183181 DEBUG nova.virt.libvirt.driver [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.526 183181 INFO nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Took 11.74 seconds to spawn the instance on the hypervisor.
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.527 183181 DEBUG nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:07:12 compute-0 nova_compute[183177]: 2026-01-26 20:07:12.586 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.061 183181 INFO nova.compute.manager [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Took 17.02 seconds to build instance.
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.429 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.556 183181 DEBUG nova.compute.manager [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.557 183181 DEBUG oslo_concurrency.lockutils [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.557 183181 DEBUG oslo_concurrency.lockutils [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.557 183181 DEBUG oslo_concurrency.lockutils [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.557 183181 DEBUG nova.compute.manager [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No waiting events found dispatching network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.558 183181 WARNING nova.compute.manager [req-982fc991-5063-4812-83f4-2b89c35efc7a req-97089fe6-4b98-408c-bb95-899b35fed404 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received unexpected event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with vm_state active and task_state None.
Jan 26 20:07:13 compute-0 nova_compute[183177]: 2026-01-26 20:07:13.571 183181 DEBUG oslo_concurrency.lockutils [None req-7fca375f-b21a-4705-a250-7fe5b4008832 b3d5258d30ef4be39230c019f11bed8f 8d30bc5631f24a6799364d53cb4e9465 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.545s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:14 compute-0 nova_compute[183177]: 2026-01-26 20:07:14.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:14 compute-0 nova_compute[183177]: 2026-01-26 20:07:14.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:14 compute-0 nova_compute[183177]: 2026-01-26 20:07:14.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:14 compute-0 nova_compute[183177]: 2026-01-26 20:07:14.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:14 compute-0 nova_compute[183177]: 2026-01-26 20:07:14.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.732 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.830 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.832 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.900 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.910 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.964 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:15 compute-0 nova_compute[183177]: 2026-01-26 20:07:15.965 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.022 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.253 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.256 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.284 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.285 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5425MB free_disk=73.06868743896484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.285 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:16 compute-0 nova_compute[183177]: 2026-01-26 20:07:16.286 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.351 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance a29d6012-19fb-48c7-864f-28ed4c715d89 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.353 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.354 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.355 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:07:16 up  1:31,  0 user,  load average: 0.46, 0.32, 0.29\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:07:17 compute-0 podman[215245]: 2026-01-26 20:07:17.363529159 +0000 UTC m=+0.096405155 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.388 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.431 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.432 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.468 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.558 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.589 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:17 compute-0 nova_compute[183177]: 2026-01-26 20:07:17.597 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:07:18 compute-0 nova_compute[183177]: 2026-01-26 20:07:18.105 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:07:18 compute-0 nova_compute[183177]: 2026-01-26 20:07:18.432 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:18 compute-0 nova_compute[183177]: 2026-01-26 20:07:18.619 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:07:18 compute-0 nova_compute[183177]: 2026-01-26 20:07:18.620 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.334s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:19 compute-0 nova_compute[183177]: 2026-01-26 20:07:19.622 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:19 compute-0 nova_compute[183177]: 2026-01-26 20:07:19.623 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:19 compute-0 nova_compute[183177]: 2026-01-26 20:07:19.624 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:19 compute-0 nova_compute[183177]: 2026-01-26 20:07:19.624 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:07:21 compute-0 nova_compute[183177]: 2026-01-26 20:07:21.155 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:22 compute-0 nova_compute[183177]: 2026-01-26 20:07:22.591 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:23 compute-0 nova_compute[183177]: 2026-01-26 20:07:23.434 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:24.099 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:24.100 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:07:24.101 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:24 compute-0 ovn_controller[95396]: 2026-01-26T20:07:24Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:84:2b 10.100.0.7
Jan 26 20:07:24 compute-0 ovn_controller[95396]: 2026-01-26T20:07:24Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:84:2b 10.100.0.7
Jan 26 20:07:27 compute-0 nova_compute[183177]: 2026-01-26 20:07:27.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:28 compute-0 nova_compute[183177]: 2026-01-26 20:07:28.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:29 compute-0 podman[192499]: time="2026-01-26T20:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:07:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:07:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2644 "" "Go-http-client/1.1"
Jan 26 20:07:31 compute-0 openstack_network_exporter[195363]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:07:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:07:31 compute-0 openstack_network_exporter[195363]: ERROR   20:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:07:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:07:32 compute-0 nova_compute[183177]: 2026-01-26 20:07:32.598 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.153 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.154 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.155 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.156 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.156 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.157 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:33 compute-0 nova_compute[183177]: 2026-01-26 20:07:33.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.177 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.178 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Image id 34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5 yields fingerprint 46c3eb6c81594e4d1392b4ffb0ccf21e15333434 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.179 183181 INFO nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] image 34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5 at (/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434): checking
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.179 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] image 34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5 at (/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.180 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.181 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] a29d6012-19fb-48c7-864f-28ed4c715d89 is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.181 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] a29d6012-19fb-48c7-864f-28ed4c715d89 has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.181 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.236 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.237 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance a29d6012-19fb-48c7-864f-28ed4c715d89 is backed by 46c3eb6c81594e4d1392b4ffb0ccf21e15333434 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.237 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.238 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.238 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.297 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.298 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 is backed by 46c3eb6c81594e4d1392b4ffb0ccf21e15333434 _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.298 183181 INFO nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Active base files: /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.298 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.298 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 26 20:07:34 compute-0 nova_compute[183177]: 2026-01-26 20:07:34.298 183181 DEBUG nova.virt.libvirt.imagecache [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 26 20:07:37 compute-0 sshd-session[215301]: Connection closed by authenticating user root 188.166.116.149 port 55304 [preauth]
Jan 26 20:07:37 compute-0 nova_compute[183177]: 2026-01-26 20:07:37.601 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:38 compute-0 nova_compute[183177]: 2026-01-26 20:07:38.442 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:40 compute-0 ovn_controller[95396]: 2026-01-26T20:07:40Z|00223|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Jan 26 20:07:41 compute-0 podman[215305]: 2026-01-26 20:07:41.35859832 +0000 UTC m=+0.081299869 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 20:07:41 compute-0 podman[215304]: 2026-01-26 20:07:41.390565051 +0000 UTC m=+0.118848450 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 26 20:07:41 compute-0 podman[215303]: 2026-01-26 20:07:41.402539883 +0000 UTC m=+0.136440754 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_managed=true, 
io.buildah.version=1.41.4, config_id=ovn_controller)
Jan 26 20:07:42 compute-0 nova_compute[183177]: 2026-01-26 20:07:42.604 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:43 compute-0 nova_compute[183177]: 2026-01-26 20:07:43.445 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:47 compute-0 nova_compute[183177]: 2026-01-26 20:07:47.457 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Check if temp file /var/lib/nova/instances/tmp6sm9vz_k exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:07:47 compute-0 nova_compute[183177]: 2026-01-26 20:07:47.459 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Check if temp file /var/lib/nova/instances/tmpr_7ic3q9 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:07:47 compute-0 nova_compute[183177]: 2026-01-26 20:07:47.465 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6sm9vz_k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a29d6012-19fb-48c7-864f-28ed4c715d89',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:07:47 compute-0 nova_compute[183177]: 2026-01-26 20:07:47.470 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr_7ic3q9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33cfa3c1-e4a7-4cc7-98f9-c638d97608f5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:07:47 compute-0 nova_compute[183177]: 2026-01-26 20:07:47.606 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:48 compute-0 podman[215373]: 2026-01-26 20:07:48.336022184 +0000 UTC m=+0.082212263 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:07:48 compute-0 nova_compute[183177]: 2026-01-26 20:07:48.466 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:48 compute-0 sshd-session[215371]: Connection closed by authenticating user root 142.93.140.142 port 55732 [preauth]
Jan 26 20:07:49 compute-0 sshd-session[215411]: Invalid user dbuser from 193.32.162.151 port 60190
Jan 26 20:07:49 compute-0 sshd-session[215411]: Connection closed by invalid user dbuser 193.32.162.151 port 60190 [preauth]
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.516 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.610 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.613 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.614 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.667 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.668 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Preparing to wait for external event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.669 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.669 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:52 compute-0 nova_compute[183177]: 2026-01-26 20:07:52.669 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:53 compute-0 nova_compute[183177]: 2026-01-26 20:07:53.467 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:57 compute-0 nova_compute[183177]: 2026-01-26 20:07:57.613 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:58 compute-0 nova_compute[183177]: 2026-01-26 20:07:58.507 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:07:59 compute-0 podman[192499]: time="2026-01-26T20:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:07:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:07:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2647 "" "Go-http-client/1.1"
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.807 183181 DEBUG nova.compute.manager [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.807 183181 DEBUG oslo_concurrency.lockutils [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.807 183181 DEBUG oslo_concurrency.lockutils [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.808 183181 DEBUG oslo_concurrency.lockutils [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.808 183181 DEBUG nova.compute.manager [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No event matching network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 in dict_keys([('network-vif-plugged', '9fc4022a-3a46-4857-8c00-814af094ef10')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:07:59 compute-0 nova_compute[183177]: 2026-01-26 20:07:59.808 183181 DEBUG nova.compute.manager [req-d4c5f4c5-f2a9-4b1f-b1ea-bc465c7c8466 req-66bde820-1593-4c60-af35-53a437c759ca 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:00.480 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:08:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:00.481 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:08:00 compute-0 nova_compute[183177]: 2026-01-26 20:08:00.481 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:01 compute-0 sshd[129547]: Timeout before authentication for connection from 101.126.147.62 to 38.102.83.58, pid = 214772
Jan 26 20:08:01 compute-0 openstack_network_exporter[195363]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:08:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:08:01 compute-0 openstack_network_exporter[195363]: ERROR   20:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:08:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.911 183181 DEBUG nova.compute.manager [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.911 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.911 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.912 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.912 183181 DEBUG nova.compute.manager [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Processing event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.912 183181 DEBUG nova.compute.manager [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-changed-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.912 183181 DEBUG nova.compute.manager [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Refreshing instance network info cache due to event network-changed-9fc4022a-3a46-4857-8c00-814af094ef10. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.912 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.913 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:08:01 compute-0 nova_compute[183177]: 2026-01-26 20:08:01.913 183181 DEBUG nova.network.neutron [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Refreshing network info cache for port 9fc4022a-3a46-4857-8c00-814af094ef10 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.204 183181 INFO nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Took 9.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.204 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.423 183181 WARNING neutronclient.v2_0.client [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.617 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.711 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr_7ic3q9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33cfa3c1-e4a7-4cc7-98f9-c638d97608f5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(8e4f96ae-c468-4acc-a288-d96612b036a4),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:08:02 compute-0 nova_compute[183177]: 2026-01-26 20:08:02.808 183181 WARNING neutronclient.v2_0.client [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.087 183181 DEBUG nova.network.neutron [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Updated VIF entry in instance network info cache for port 9fc4022a-3a46-4857-8c00-814af094ef10. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.087 183181 DEBUG nova.network.neutron [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Updating instance_info_cache with network_info: [{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.227 183181 DEBUG nova.objects.instance [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.228 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.230 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.230 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:08:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:03.495 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.521 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.597 183181 DEBUG oslo_concurrency.lockutils [req-663a8192-190f-43fe-b0c1-abc3dff89a08 req-5906c975-0055-4a2e-9175-0c87ff172807 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-33cfa3c1-e4a7-4cc7-98f9-c638d97608f5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.733 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.733 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.744 183181 DEBUG nova.virt.libvirt.vif [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-7814135',id=29,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:07:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-lon7s37m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:07:12Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=33cfa3c1-e4a7-4cc7-98f9-c638d97608f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.744 183181 DEBUG nova.network.os_vif_util [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.745 183181 DEBUG nova.network.os_vif_util [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.746 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:03:84:2b"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <target dev="tap9fc4022a-3a"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]: </interface>
Jan 26 20:08:03 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.747 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <name>instance-0000001d</name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <uuid>33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</uuid>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564</nova:name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:07:07</nova:creationTime>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:port uuid="9fc4022a-3a46-4857-8c00-814af094ef10">
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="serial">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="uuid">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:03:84:2b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc4022a-3a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:03 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.748 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <name>instance-0000001d</name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <uuid>33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</uuid>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564</nova:name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:07:07</nova:creationTime>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:port uuid="9fc4022a-3a46-4857-8c00-814af094ef10">
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="serial">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="uuid">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:03:84:2b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc4022a-3a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:03 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.748 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <name>instance-0000001d</name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <uuid>33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</uuid>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564</nova:name>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:07:07</nova:creationTime>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <nova:port uuid="9fc4022a-3a46-4857-8c00-814af094ef10">
Jan 26 20:08:03 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="serial">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="uuid">33cfa3c1-e4a7-4cc7-98f9-c638d97608f5</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/disk.config"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:03:84:2b"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target dev="tap9fc4022a-3a"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:03 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5/console.log" append="off"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:03 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:03 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:03 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:03 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:08:03 compute-0 nova_compute[183177]: 2026-01-26 20:08:03.749 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:08:04 compute-0 nova_compute[183177]: 2026-01-26 20:08:04.235 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:08:04 compute-0 nova_compute[183177]: 2026-01-26 20:08:04.235 183181 INFO nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.260 183181 INFO nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.661 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.774 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.774 183181 DEBUG nova.virt.libvirt.migration [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:08:05 compute-0 kernel: tap9fc4022a-3a (unregistering): left promiscuous mode
Jan 26 20:08:05 compute-0 NetworkManager[55489]: <info>  [1769458085.9165] device (tap9fc4022a-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:08:05 compute-0 ovn_controller[95396]: 2026-01-26T20:08:05Z|00224|binding|INFO|Releasing lport 9fc4022a-3a46-4857-8c00-814af094ef10 from this chassis (sb_readonly=0)
Jan 26 20:08:05 compute-0 ovn_controller[95396]: 2026-01-26T20:08:05Z|00225|binding|INFO|Setting lport 9fc4022a-3a46-4857-8c00-814af094ef10 down in Southbound
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.961 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:05 compute-0 ovn_controller[95396]: 2026-01-26T20:08:05Z|00226|binding|INFO|Removing iface tap9fc4022a-3a ovn-installed in OVS
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.963 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:05.973 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:84:2b 10.100.0.7'], port_security=['fa:16:3e:03:84:2b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33cfa3c1-e4a7-4cc7-98f9-c638d97608f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=9fc4022a-3a46-4857-8c00-814af094ef10) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:08:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:05.975 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc4022a-3a46-4857-8c00-814af094ef10 in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c unbound from our chassis
Jan 26 20:08:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:05.976 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c
Jan 26 20:08:05 compute-0 nova_compute[183177]: 2026-01-26 20:08:05.977 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:05 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:05.992 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[582fcb34-b169-43d8-937a-1c9c70bb5503]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 26 20:08:06 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 14.758s CPU time.
Jan 26 20:08:06 compute-0 systemd-machined[154465]: Machine qemu-22-instance-0000001d terminated.
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.023 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[bb84e13f-43ba-4115-957c-70df3ec6882b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.026 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[66840ebd-d54f-49b8-8e71-7c491c08dd46]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.051 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[894c94f2-3cd0-45ee-a721-806fa1fab7e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.072 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[67f03c43-0823-4b1a-8a7b-26792066ffd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f030a0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:c2:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546788, 'reachable_time': 38542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215448, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.087 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e346f866-9811-41bf-9445-d35e7432b1e1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546805, 'tstamp': 546805}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215449, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd9f030a0-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546810, 'tstamp': 546810}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215449, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.088 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.090 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.094 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.094 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f030a0-20, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.094 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.095 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f030a0-20, col_values=(('external_ids', {'iface-id': '63338c40-169b-4962-a6c8-8ca20b375080'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.095 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:08:06 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:06.096 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6965f8-f3a1-4689-844d-72d0c3c75465]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.161 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.161 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.162 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.277 183181 DEBUG nova.virt.libvirt.guest [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '33cfa3c1-e4a7-4cc7-98f9-c638d97608f5' (instance-0000001d) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.277 183181 INFO nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migration operation has completed
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.278 183181 INFO nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] _post_live_migration() is started..
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.349 183181 WARNING neutronclient.v2_0.client [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.349 183181 WARNING neutronclient.v2_0.client [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.598 183181 DEBUG nova.compute.manager [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.599 183181 DEBUG oslo_concurrency.lockutils [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.599 183181 DEBUG oslo_concurrency.lockutils [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.599 183181 DEBUG oslo_concurrency.lockutils [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.599 183181 DEBUG nova.compute.manager [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No waiting events found dispatching network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.601 183181 DEBUG nova.compute.manager [req-48184a9d-455e-4c6d-bb40-ea6f2bbbd5b1 req-756d2f69-e430-43ad-8acd-eed246e86c95 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.973 183181 DEBUG nova.network.neutron [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 9fc4022a-3a46-4857-8c00-814af094ef10 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.974 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.975 183181 DEBUG nova.virt.libvirt.vif [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:06:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-781413564',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-7814135',id=29,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:07:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-lon7s37m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:07:42Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=33cfa3c1-e4a7-4cc7-98f9-c638d97608f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.976 183181 DEBUG nova.network.os_vif_util [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "9fc4022a-3a46-4857-8c00-814af094ef10", "address": "fa:16:3e:03:84:2b", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc4022a-3a", "ovs_interfaceid": "9fc4022a-3a46-4857-8c00-814af094ef10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.977 183181 DEBUG nova.network.os_vif_util [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.977 183181 DEBUG os_vif [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.980 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.980 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fc4022a-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.982 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.985 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.987 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.987 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=12b910fc-1b46-4f38-bfc1-a0f02ebd1eae) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.988 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.992 183181 INFO os_vif [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:84:2b,bridge_name='br-int',has_traffic_filtering=True,id=9fc4022a-3a46-4857-8c00-814af094ef10,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc4022a-3a')
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.992 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.993 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.993 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.994 183181 DEBUG nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.995 183181 INFO nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Deleting instance files /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5_del
Jan 26 20:08:06 compute-0 nova_compute[183177]: 2026-01-26 20:08:06.995 183181 INFO nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Deletion of /var/lib/nova/instances/33cfa3c1-e4a7-4cc7-98f9-c638d97608f5_del complete
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.523 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.708 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.709 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.709 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.709 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.710 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No waiting events found dispatching network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.710 183181 WARNING nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received unexpected event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with vm_state active and task_state migrating.
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.711 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.711 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.711 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.711 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.712 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No waiting events found dispatching network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.712 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-unplugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.712 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.713 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.713 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.714 183181 DEBUG oslo_concurrency.lockutils [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.714 183181 DEBUG nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] No waiting events found dispatching network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:08 compute-0 nova_compute[183177]: 2026-01-26 20:08:08.714 183181 WARNING nova.compute.manager [req-5a8eb398-c886-4e5e-8114-f2187043937b req-78e7a9b3-8582-4c98-9fa3-f0cbdc2615d5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Received unexpected event network-vif-plugged-9fc4022a-3a46-4857-8c00-814af094ef10 for instance with vm_state active and task_state migrating.
Jan 26 20:08:09 compute-0 nova_compute[183177]: 2026-01-26 20:08:09.661 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:11 compute-0 nova_compute[183177]: 2026-01-26 20:08:11.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:12 compute-0 nova_compute[183177]: 2026-01-26 20:08:12.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:12 compute-0 nova_compute[183177]: 2026-01-26 20:08:12.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 20:08:12 compute-0 podman[215469]: 2026-01-26 20:08:12.350818655 +0000 UTC m=+0.088137993 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 26 20:08:12 compute-0 podman[215470]: 2026-01-26 20:08:12.375541611 +0000 UTC m=+0.104554046 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 20:08:12 compute-0 podman[215468]: 2026-01-26 20:08:12.391360306 +0000 UTC m=+0.129013813 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 20:08:13 compute-0 nova_compute[183177]: 2026-01-26 20:08:13.560 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:14 compute-0 nova_compute[183177]: 2026-01-26 20:08:14.657 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:14 compute-0 nova_compute[183177]: 2026-01-26 20:08:14.658 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:15 compute-0 nova_compute[183177]: 2026-01-26 20:08:15.173 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:15 compute-0 nova_compute[183177]: 2026-01-26 20:08:15.174 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:15 compute-0 nova_compute[183177]: 2026-01-26 20:08:15.175 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:15 compute-0 nova_compute[183177]: 2026-01-26 20:08:15.175 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.231 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.294 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.295 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.354 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.504 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.506 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.543 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.544 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5582MB free_disk=73.06155776977539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.544 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.544 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:16 compute-0 nova_compute[183177]: 2026-01-26 20:08:16.992 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:17 compute-0 nova_compute[183177]: 2026-01-26 20:08:17.565 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration for instance 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.046 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.047 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.047 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "33cfa3c1-e4a7-4cc7-98f9-c638d97608f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.074 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.075 183181 INFO nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating resource usage from migration b2f8df64-c128-4969-b068-1d1aa1bed397
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.120 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration b2f8df64-c128-4969-b068-1d1aa1bed397 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.120 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Migration 8e4f96ae-c468-4acc-a288-d96612b036a4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.121 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.121 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:08:16 up  1:32,  0 user,  load average: 0.38, 0.32, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:08:18 compute-0 sshd-session[215541]: Connection closed by authenticating user root 188.166.116.149 port 42510 [preauth]
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.198 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.560 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.598 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:18 compute-0 nova_compute[183177]: 2026-01-26 20:08:18.708 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:08:19 compute-0 nova_compute[183177]: 2026-01-26 20:08:19.222 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:08:19 compute-0 nova_compute[183177]: 2026-01-26 20:08:19.223 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.679s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:19 compute-0 nova_compute[183177]: 2026-01-26 20:08:19.223 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.663s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:19 compute-0 nova_compute[183177]: 2026-01-26 20:08:19.224 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:19 compute-0 nova_compute[183177]: 2026-01-26 20:08:19.224 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:08:19 compute-0 podman[215543]: 2026-01-26 20:08:19.337231663 +0000 UTC m=+0.073124439 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.266 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.352 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.355 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.421 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.560 183181 WARNING nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.562 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.578 183181 DEBUG oslo_concurrency.processutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.579 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5583MB free_disk=73.06155776977539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.579 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.580 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.718 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.719 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.719 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:20 compute-0 nova_compute[183177]: 2026-01-26 20:08:20.720 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:08:21 compute-0 nova_compute[183177]: 2026-01-26 20:08:21.632 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:08:21 compute-0 nova_compute[183177]: 2026-01-26 20:08:21.995 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.141 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.142 183181 INFO nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating resource usage from migration b2f8df64-c128-4969-b068-1d1aa1bed397
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.172 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration b2f8df64-c128-4969-b068-1d1aa1bed397 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.173 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 8e4f96ae-c468-4acc-a288-d96612b036a4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.173 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.174 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:08:20 up  1:32,  0 user,  load average: 0.35, 0.32, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_8d30bc5631f24a6799364d53cb4e9465': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.258 183181 DEBUG nova.compute.provider_tree [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:08:22 compute-0 nova_compute[183177]: 2026-01-26 20:08:22.769 183181 DEBUG nova.scheduler.client.report [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:08:23 compute-0 nova_compute[183177]: 2026-01-26 20:08:23.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:23 compute-0 nova_compute[183177]: 2026-01-26 20:08:23.282 183181 DEBUG nova.compute.resource_tracker [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:08:23 compute-0 nova_compute[183177]: 2026-01-26 20:08:23.282 183181 DEBUG oslo_concurrency.lockutils [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.703s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:23 compute-0 nova_compute[183177]: 2026-01-26 20:08:23.304 183181 INFO nova.compute.manager [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:08:23 compute-0 nova_compute[183177]: 2026-01-26 20:08:23.633 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:24.104 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:24.105 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:24.105 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:24 compute-0 nova_compute[183177]: 2026-01-26 20:08:24.380 183181 INFO nova.scheduler.client.report [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 8e4f96ae-c468-4acc-a288-d96612b036a4
Jan 26 20:08:24 compute-0 nova_compute[183177]: 2026-01-26 20:08:24.380 183181 DEBUG nova.virt.libvirt.driver [None req-92f0e8bb-c9d7-42c8-9aac-529140559922 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 33cfa3c1-e4a7-4cc7-98f9-c638d97608f5] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.400 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.495 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.496 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.581 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.582 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Preparing to wait for external event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.582 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.582 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:25 compute-0 nova_compute[183177]: 2026-01-26 20:08:25.582 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:27 compute-0 nova_compute[183177]: 2026-01-26 20:08:27.028 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:27 compute-0 sshd-session[215581]: Connection closed by authenticating user root 142.93.140.142 port 49136 [preauth]
Jan 26 20:08:28 compute-0 nova_compute[183177]: 2026-01-26 20:08:28.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:28 compute-0 nova_compute[183177]: 2026-01-26 20:08:28.635 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:29 compute-0 podman[192499]: time="2026-01-26T20:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:08:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:08:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2644 "" "Go-http-client/1.1"
Jan 26 20:08:31 compute-0 openstack_network_exporter[195363]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:08:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:08:31 compute-0 openstack_network_exporter[195363]: ERROR   20:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:08:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.030 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.953 183181 DEBUG nova.compute.manager [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.954 183181 DEBUG oslo_concurrency.lockutils [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.954 183181 DEBUG oslo_concurrency.lockutils [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.955 183181 DEBUG oslo_concurrency.lockutils [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.955 183181 DEBUG nova.compute.manager [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No event matching network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce in dict_keys([('network-vif-plugged', 'bf218032-dd19-417b-ad93-d29e2b451fce')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:08:32 compute-0 nova_compute[183177]: 2026-01-26 20:08:32.955 183181 DEBUG nova.compute.manager [req-a8e5359f-0219-436a-8ea4-a4fcef258de5 req-98e53c4c-0210-4f26-9ebe-5693b3935448 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:33 compute-0 nova_compute[183177]: 2026-01-26 20:08:33.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:08:33 compute-0 nova_compute[183177]: 2026-01-26 20:08:33.669 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:34 compute-0 nova_compute[183177]: 2026-01-26 20:08:34.610 183181 INFO nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Took 9.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.022 183181 DEBUG nova.compute.manager [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.023 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.023 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.024 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.024 183181 DEBUG nova.compute.manager [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Processing event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.025 183181 DEBUG nova.compute.manager [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-changed-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.025 183181 DEBUG nova.compute.manager [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Refreshing instance network info cache due to event network-changed-bf218032-dd19-417b-ad93-d29e2b451fce. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.026 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.026 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.026 183181 DEBUG nova.network.neutron [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Refreshing network info cache for port bf218032-dd19-417b-ad93-d29e2b451fce _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.028 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.534 183181 WARNING neutronclient.v2_0.client [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:35 compute-0 nova_compute[183177]: 2026-01-26 20:08:35.543 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6sm9vz_k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a29d6012-19fb-48c7-864f-28ed4c715d89',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(b2f8df64-c128-4969-b068-1d1aa1bed397),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.060 183181 DEBUG nova.objects.instance [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid a29d6012-19fb-48c7-864f-28ed4c715d89 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.061 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.063 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.063 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.565 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.565 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.574 183181 DEBUG nova.virt.libvirt.vif [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:06:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-4899717',id=28,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:06:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-45j0gckj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:06:50Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=a29d6012-19fb-48c7-864f-28ed4c715d89,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.575 183181 DEBUG nova.network.os_vif_util [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.576 183181 DEBUG nova.network.os_vif_util [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.576 183181 DEBUG nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:eb:80:11"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <target dev="tapbf218032-dd"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]: </interface>
Jan 26 20:08:36 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.577 183181 DEBUG nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <name>instance-0000001c</name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <uuid>a29d6012-19fb-48c7-864f-28ed4c715d89</uuid>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786</nova:name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:06:44</nova:creationTime>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:port uuid="bf218032-dd19-417b-ad93-d29e2b451fce">
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="serial">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="uuid">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:eb:80:11"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbf218032-dd"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:36 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.578 183181 DEBUG nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <name>instance-0000001c</name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <uuid>a29d6012-19fb-48c7-864f-28ed4c715d89</uuid>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786</nova:name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:06:44</nova:creationTime>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:port uuid="bf218032-dd19-417b-ad93-d29e2b451fce">
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="serial">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="uuid">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:eb:80:11"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbf218032-dd"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:36 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.578 183181 DEBUG nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <name>instance-0000001c</name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <uuid>a29d6012-19fb-48c7-864f-28ed4c715d89</uuid>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786</nova:name>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:06:44</nova:creationTime>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:user uuid="b3d5258d30ef4be39230c019f11bed8f">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin</nova:user>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:project uuid="8d30bc5631f24a6799364d53cb4e9465">tempest-TestExecuteWorkloadStabilizationStrategy-1560930857</nova:project>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <nova:port uuid="bf218032-dd19-417b-ad93-d29e2b451fce">
Jan 26 20:08:36 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="serial">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="uuid">a29d6012-19fb-48c7-864f-28ed4c715d89</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </system>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </os>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </features>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/disk.config"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:eb:80:11"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target dev="tapbf218032-dd"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:08:36 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       </target>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89/console.log" append="off"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </console>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </input>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </video>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:08:36 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:08:36 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:08:36 compute-0 nova_compute[183177]: </domain>
Jan 26 20:08:36 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.578 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.769 183181 WARNING neutronclient.v2_0.client [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.924 183181 DEBUG nova.network.neutron [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updated VIF entry in instance network info cache for port bf218032-dd19-417b-ad93-d29e2b451fce. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:08:36 compute-0 nova_compute[183177]: 2026-01-26 20:08:36.924 183181 DEBUG nova.network.neutron [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Updating instance_info_cache with network_info: [{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:08:36 compute-0 ovn_controller[95396]: 2026-01-26T20:08:36Z|00227|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 26 20:08:37 compute-0 nova_compute[183177]: 2026-01-26 20:08:37.033 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:37 compute-0 nova_compute[183177]: 2026-01-26 20:08:37.067 183181 DEBUG nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:08:37 compute-0 nova_compute[183177]: 2026-01-26 20:08:37.068 183181 INFO nova.virt.libvirt.migration [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:08:37 compute-0 nova_compute[183177]: 2026-01-26 20:08:37.437 183181 DEBUG oslo_concurrency.lockutils [req-0c48c0d1-aa50-4404-ac1b-672b75b18efc req-18672b89-0f92-4822-b227-f6457a9711e5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-a29d6012-19fb-48c7-864f-28ed4c715d89" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.087 183181 INFO nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:08:38 compute-0 kernel: tapbf218032-dd (unregistering): left promiscuous mode
Jan 26 20:08:38 compute-0 NetworkManager[55489]: <info>  [1769458118.4315] device (tapbf218032-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:08:38 compute-0 ovn_controller[95396]: 2026-01-26T20:08:38Z|00228|binding|INFO|Releasing lport bf218032-dd19-417b-ad93-d29e2b451fce from this chassis (sb_readonly=0)
Jan 26 20:08:38 compute-0 ovn_controller[95396]: 2026-01-26T20:08:38Z|00229|binding|INFO|Setting lport bf218032-dd19-417b-ad93-d29e2b451fce down in Southbound
Jan 26 20:08:38 compute-0 ovn_controller[95396]: 2026-01-26T20:08:38Z|00230|binding|INFO|Removing iface tapbf218032-dd ovn-installed in OVS
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.438 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.445 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:80:11 10.100.0.10'], port_security=['fa:16:3e:eb:80:11 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'a29d6012-19fb-48c7-864f-28ed4c715d89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d30bc5631f24a6799364d53cb4e9465', 'neutron:revision_number': '10', 'neutron:security_group_ids': '2edde6da-001c-4935-bbf6-81cd72a69f91', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8598e857-e8d7-4ecf-97dd-17cf2e766e3a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=bf218032-dd19-417b-ad93-d29e2b451fce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.446 104672 INFO neutron.agent.ovn.metadata.agent [-] Port bf218032-dd19-417b-ad93-d29e2b451fce in datapath d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c unbound from our chassis
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.447 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.448 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[02368819-cbf2-41e4-a577-4299f32b98a1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.448 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c namespace which is not needed anymore
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.471 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 26 20:08:38 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Consumed 18.106s CPU time.
Jan 26 20:08:38 compute-0 systemd-machined[154465]: Machine qemu-21-instance-0000001c terminated.
Jan 26 20:08:38 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [NOTICE]   (215084) : haproxy version is 3.0.5-8e879a5
Jan 26 20:08:38 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [NOTICE]   (215084) : path to executable is /usr/sbin/haproxy
Jan 26 20:08:38 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [WARNING]  (215084) : Exiting Master process...
Jan 26 20:08:38 compute-0 podman[215628]: 2026-01-26 20:08:38.595888832 +0000 UTC m=+0.036539976 container kill 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 20:08:38 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [ALERT]    (215084) : Current worker (215086) exited with code 143 (Terminated)
Jan 26 20:08:38 compute-0 neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c[215080]: [WARNING]  (215084) : All workers exited. Exiting... (0)
Jan 26 20:08:38 compute-0 systemd[1]: libpod-429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80.scope: Deactivated successfully.
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.629 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.634 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 podman[215644]: 2026-01-26 20:08:38.651097857 +0000 UTC m=+0.032537406 container died 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.668 183181 DEBUG nova.virt.libvirt.guest [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.669 183181 INFO nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration operation has completed
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.669 183181 INFO nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] _post_live_migration() is started..
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.670 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.673 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.673 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.673 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.681 183181 WARNING neutronclient.v2_0.client [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.682 183181 WARNING neutronclient.v2_0.client [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:08:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80-userdata-shm.mount: Deactivated successfully.
Jan 26 20:08:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-57606e01d76059b42d341c615673e01380e28eb8ad62b61df75b9137dd393735-merged.mount: Deactivated successfully.
Jan 26 20:08:38 compute-0 podman[215644]: 2026-01-26 20:08:38.693525599 +0000 UTC m=+0.074965098 container cleanup 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:08:38 compute-0 systemd[1]: libpod-conmon-429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80.scope: Deactivated successfully.
Jan 26 20:08:38 compute-0 podman[215646]: 2026-01-26 20:08:38.711622336 +0000 UTC m=+0.082149221 container remove 429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.721 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4940d1-791f-4204-b12f-32744f28ec6e]: (4, ("Mon Jan 26 08:08:38 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c (429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80)\n429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80\nMon Jan 26 08:08:38 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c (429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80)\n429fab390fc3ef52c620b4285ef5ce57744890d241d1c087898c49fdf8dadc80\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.723 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7265b8ba-e4c8-4085-a3f0-1e43f8065b1b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.723 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.724 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[82747ae9-2902-49ad-9edb-a1930eaa3e11]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.724 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f030a0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.727 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 kernel: tapd9f030a0-20: left promiscuous mode
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.758 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.760 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[85707a72-cbcb-4a7f-a70c-efccd18fad0c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.776 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed82d94-795e-4342-9e25-ca8ff4ef3e22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.778 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[38de1440-1fe0-498b-b5c0-128ab3d702a4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.806 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[2b8e5ed8-f79c-4d8f-beb5-47b91140f433]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546777, 'reachable_time': 36924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215692, 'error': None, 'target': 'ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.811 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:08:38 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:08:38.811 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae80870-bed6-4878-b3b8-e9de0382cb4e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:08:38 compute-0 systemd[1]: run-netns-ovnmeta\x2dd9f030a0\x2d2e80\x2d4f5c\x2d97ab\x2deb7e0b1edd6c.mount: Deactivated successfully.
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.880 183181 DEBUG nova.compute.manager [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.881 183181 DEBUG oslo_concurrency.lockutils [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.881 183181 DEBUG oslo_concurrency.lockutils [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.882 183181 DEBUG oslo_concurrency.lockutils [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.882 183181 DEBUG nova.compute.manager [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:38 compute-0 nova_compute[183177]: 2026-01-26 20:08:38.882 183181 DEBUG nova.compute.manager [req-c1689b60-a053-4b2c-8425-1a91586584d2 req-813d9728-1f68-4f91-ac53-5ab9cb6a3d83 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.684 183181 DEBUG nova.compute.manager [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.685 183181 DEBUG oslo_concurrency.lockutils [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.685 183181 DEBUG oslo_concurrency.lockutils [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.686 183181 DEBUG oslo_concurrency.lockutils [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.686 183181 DEBUG nova.compute.manager [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.687 183181 DEBUG nova.compute.manager [req-c305cf25-bab8-47ac-a16e-160d0f03bf2e req-50043ebd-acd0-428f-8d7e-4af57c8bf050 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.799 183181 DEBUG nova.network.neutron [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port bf218032-dd19-417b-ad93-d29e2b451fce and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.800 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.801 183181 DEBUG nova.virt.libvirt.vif [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:06:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-489971786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-4899717',id=28,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:06:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d30bc5631f24a6799364d53cb4e9465',ramdisk_id='',reservation_id='r-45j0gckj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,manager,reader,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',imag
e_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-1560930857-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:07:42Z,user_data=None,user_id='b3d5258d30ef4be39230c019f11bed8f',uuid=a29d6012-19fb-48c7-864f-28ed4c715d89,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.801 183181 DEBUG nova.network.os_vif_util [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "bf218032-dd19-417b-ad93-d29e2b451fce", "address": "fa:16:3e:eb:80:11", "network": {"id": "d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-2001678116-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3354485a76de41f592c90f9741b8c515", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf218032-dd", "ovs_interfaceid": "bf218032-dd19-417b-ad93-d29e2b451fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.802 183181 DEBUG nova.network.os_vif_util [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.802 183181 DEBUG os_vif [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.806 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.806 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf218032-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.855 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.856 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.858 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.858 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4f266747-330a-4177-981a-03c3cf44ea0c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.859 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.861 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.864 183181 INFO os_vif [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:80:11,bridge_name='br-int',has_traffic_filtering=True,id=bf218032-dd19-417b-ad93-d29e2b451fce,network=Network(d9f030a0-2e80-4f5c-97ab-eb7e0b1edd6c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf218032-dd')
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.865 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.865 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.865 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.866 183181 DEBUG nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.866 183181 INFO nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Deleting instance files /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89_del
Jan 26 20:08:39 compute-0 nova_compute[183177]: 2026-01-26 20:08:39.868 183181 INFO nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Deletion of /var/lib/nova/instances/a29d6012-19fb-48c7-864f-28ed4c715d89_del complete
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.960 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.962 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.963 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.964 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.964 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.965 183181 WARNING nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received unexpected event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with vm_state active and task_state migrating.
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.965 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.966 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.966 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.967 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.968 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.968 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-unplugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.969 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.969 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.970 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.971 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.971 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.972 183181 WARNING nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received unexpected event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with vm_state active and task_state migrating.
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.972 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.973 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.974 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.974 183181 DEBUG oslo_concurrency.lockutils [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.975 183181 DEBUG nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] No waiting events found dispatching network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:08:40 compute-0 nova_compute[183177]: 2026-01-26 20:08:40.975 183181 WARNING nova.compute.manager [req-c3872615-9960-43cc-a610-4b1d0e74474f req-e24d5535-51f1-4a83-a9f2-9e3112b7a7c6 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Received unexpected event network-vif-plugged-bf218032-dd19-417b-ad93-d29e2b451fce for instance with vm_state active and task_state migrating.
Jan 26 20:08:43 compute-0 podman[215695]: 2026-01-26 20:08:43.334191885 +0000 UTC m=+0.055491255 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 20:08:43 compute-0 podman[215693]: 2026-01-26 20:08:43.379505565 +0000 UTC m=+0.124997146 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 20:08:43 compute-0 podman[215694]: 2026-01-26 20:08:43.391594861 +0000 UTC m=+0.125148691 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible)
Jan 26 20:08:43 compute-0 nova_compute[183177]: 2026-01-26 20:08:43.672 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:44 compute-0 nova_compute[183177]: 2026-01-26 20:08:44.861 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:48 compute-0 nova_compute[183177]: 2026-01-26 20:08:48.710 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.419 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.420 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.420 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a29d6012-19fb-48c7-864f-28ed4c715d89-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.904 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.940 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.941 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.942 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:49 compute-0 nova_compute[183177]: 2026-01-26 20:08:49.942 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:08:50 compute-0 podman[215762]: 2026-01-26 20:08:50.10684725 +0000 UTC m=+0.106330973 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.217 183181 WARNING nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.220 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.256 183181 DEBUG oslo_concurrency.processutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.257 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5750MB free_disk=73.09015655517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.258 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:08:50 compute-0 nova_compute[183177]: 2026-01-26 20:08:50.258 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.280 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance a29d6012-19fb-48c7-864f-28ed4c715d89 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.790 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.834 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration b2f8df64-c128-4969-b068-1d1aa1bed397 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.835 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.836 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:08:50 up  1:33,  0 user,  load average: 0.27, 0.30, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:08:51 compute-0 nova_compute[183177]: 2026-01-26 20:08:51.892 183181 DEBUG nova.compute.provider_tree [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:08:52 compute-0 nova_compute[183177]: 2026-01-26 20:08:52.402 183181 DEBUG nova.scheduler.client.report [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:08:52 compute-0 nova_compute[183177]: 2026-01-26 20:08:52.919 183181 DEBUG nova.compute.resource_tracker [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:08:52 compute-0 nova_compute[183177]: 2026-01-26 20:08:52.921 183181 DEBUG oslo_concurrency.lockutils [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.662s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:08:52 compute-0 nova_compute[183177]: 2026-01-26 20:08:52.949 183181 INFO nova.compute.manager [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:08:53 compute-0 nova_compute[183177]: 2026-01-26 20:08:53.713 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:54 compute-0 nova_compute[183177]: 2026-01-26 20:08:54.040 183181 INFO nova.scheduler.client.report [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration b2f8df64-c128-4969-b068-1d1aa1bed397
Jan 26 20:08:54 compute-0 nova_compute[183177]: 2026-01-26 20:08:54.041 183181 DEBUG nova.virt.libvirt.driver [None req-b001ea9f-05ca-48c5-949f-54917cb14818 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a29d6012-19fb-48c7-864f-28ed4c715d89] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:08:54 compute-0 nova_compute[183177]: 2026-01-26 20:08:54.911 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:58 compute-0 sshd-session[215789]: Connection closed by authenticating user root 188.166.116.149 port 57984 [preauth]
Jan 26 20:08:58 compute-0 nova_compute[183177]: 2026-01-26 20:08:58.730 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:08:59 compute-0 podman[192499]: time="2026-01-26T20:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:08:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:08:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 26 20:08:59 compute-0 nova_compute[183177]: 2026-01-26 20:08:59.952 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:01 compute-0 openstack_network_exporter[195363]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:09:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:09:01 compute-0 openstack_network_exporter[195363]: ERROR   20:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:09:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:09:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:02.707 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:09:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:02.708 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:09:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:02.709 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:02 compute-0 nova_compute[183177]: 2026-01-26 20:09:02.709 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:03 compute-0 nova_compute[183177]: 2026-01-26 20:09:03.782 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:04 compute-0 sshd-session[215792]: Connection closed by authenticating user root 142.93.140.142 port 47882 [preauth]
Jan 26 20:09:04 compute-0 nova_compute[183177]: 2026-01-26 20:09:04.660 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:04 compute-0 nova_compute[183177]: 2026-01-26 20:09:04.971 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:08 compute-0 nova_compute[183177]: 2026-01-26 20:09:08.786 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:09 compute-0 nova_compute[183177]: 2026-01-26 20:09:09.635 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:10 compute-0 nova_compute[183177]: 2026-01-26 20:09:10.011 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:11 compute-0 nova_compute[183177]: 2026-01-26 20:09:11.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:13 compute-0 nova_compute[183177]: 2026-01-26 20:09:13.824 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:14 compute-0 podman[215796]: 2026-01-26 20:09:14.345654961 +0000 UTC m=+0.075060312 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:09:14 compute-0 podman[215795]: 2026-01-26 20:09:14.349417421 +0000 UTC m=+0.085464971 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 20:09:14 compute-0 podman[215794]: 2026-01-26 20:09:14.392263065 +0000 UTC m=+0.132819106 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 20:09:15 compute-0 nova_compute[183177]: 2026-01-26 20:09:15.014 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.674 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.869 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.870 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.905 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.905 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5756MB free_disk=73.09015655517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.906 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:16 compute-0 nova_compute[183177]: 2026-01-26 20:09:16.906 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:17 compute-0 nova_compute[183177]: 2026-01-26 20:09:17.976 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:09:17 compute-0 nova_compute[183177]: 2026-01-26 20:09:17.976 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:09:16 up  1:33,  0 user,  load average: 0.17, 0.28, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:09:18 compute-0 nova_compute[183177]: 2026-01-26 20:09:18.179 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:09:18 compute-0 nova_compute[183177]: 2026-01-26 20:09:18.686 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:09:18 compute-0 nova_compute[183177]: 2026-01-26 20:09:18.825 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:19.166 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:8a:fc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4d84bb16e02476fb48c432b9e91f9fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03a6e965-0f13-4adb-a0dd-8b518d1d2445, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f7d8ca99-79c7-4c8b-b8a1-edbf275471a0) old=Port_Binding(mac=['fa:16:3e:1d:8a:fc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4d84bb16e02476fb48c432b9e91f9fe', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:09:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:19.168 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f7d8ca99-79c7-4c8b-b8a1-edbf275471a0 in datapath bbde741b-e853-4ad9-b0df-87ed33f347f8 updated
Jan 26 20:09:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:19.168 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bbde741b-e853-4ad9-b0df-87ed33f347f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:09:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:19.170 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[17fa9cc0-12af-46f7-acfa-b65c1fd84141]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:19 compute-0 nova_compute[183177]: 2026-01-26 20:09:19.198 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:09:19 compute-0 nova_compute[183177]: 2026-01-26 20:09:19.198 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.292s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:20 compute-0 nova_compute[183177]: 2026-01-26 20:09:20.019 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:20 compute-0 podman[215858]: 2026-01-26 20:09:20.357348421 +0000 UTC m=+0.089814768 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:09:21 compute-0 nova_compute[183177]: 2026-01-26 20:09:21.199 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:21 compute-0 nova_compute[183177]: 2026-01-26 20:09:21.199 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:23 compute-0 nova_compute[183177]: 2026-01-26 20:09:23.829 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:24.107 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:24.108 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:24.108 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:24 compute-0 nova_compute[183177]: 2026-01-26 20:09:24.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:25 compute-0 nova_compute[183177]: 2026-01-26 20:09:25.022 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:25 compute-0 nova_compute[183177]: 2026-01-26 20:09:25.741 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:09:28 compute-0 nova_compute[183177]: 2026-01-26 20:09:28.830 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:29 compute-0 podman[192499]: time="2026-01-26T20:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:09:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:09:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 26 20:09:30 compute-0 nova_compute[183177]: 2026-01-26 20:09:30.049 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:31 compute-0 openstack_network_exporter[195363]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:09:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:09:31 compute-0 openstack_network_exporter[195363]: ERROR   20:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:09:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:09:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:31.585 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:2c:5a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-177c2a10-700d-43a3-bbee-7e082cfe9ee4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-177c2a10-700d-43a3-bbee-7e082cfe9ee4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=73acaf79-484c-41aa-a3e1-bd9eb1e76f77, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3fc1c277-63c8-4700-b7ef-385ab00fe105) old=Port_Binding(mac=['fa:16:3e:5e:2c:5a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-177c2a10-700d-43a3-bbee-7e082cfe9ee4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-177c2a10-700d-43a3-bbee-7e082cfe9ee4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:09:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:31.587 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3fc1c277-63c8-4700-b7ef-385ab00fe105 in datapath 177c2a10-700d-43a3-bbee-7e082cfe9ee4 updated
Jan 26 20:09:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:31.587 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 177c2a10-700d-43a3-bbee-7e082cfe9ee4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:09:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:31.588 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d34cbe87-c8fb-447b-a1bf-7f723a186d38]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:33 compute-0 nova_compute[183177]: 2026-01-26 20:09:33.877 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:35 compute-0 nova_compute[183177]: 2026-01-26 20:09:35.052 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:36 compute-0 sshd-session[215885]: Connection closed by authenticating user root 188.166.116.149 port 59502 [preauth]
Jan 26 20:09:38 compute-0 nova_compute[183177]: 2026-01-26 20:09:38.877 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:40 compute-0 nova_compute[183177]: 2026-01-26 20:09:40.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:41 compute-0 sshd-session[215887]: Connection closed by authenticating user root 142.93.140.142 port 45270 [preauth]
Jan 26 20:09:42 compute-0 ovn_controller[95396]: 2026-01-26T20:09:42Z|00231|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 20:09:43 compute-0 nova_compute[183177]: 2026-01-26 20:09:43.909 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:44 compute-0 nova_compute[183177]: 2026-01-26 20:09:44.881 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:44 compute-0 nova_compute[183177]: 2026-01-26 20:09:44.882 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.143 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:45 compute-0 podman[215890]: 2026-01-26 20:09:45.362267487 +0000 UTC m=+0.094559287 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 26 20:09:45 compute-0 podman[215891]: 2026-01-26 20:09:45.384576768 +0000 UTC m=+0.114525764 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.389 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:09:45 compute-0 podman[215889]: 2026-01-26 20:09:45.391569676 +0000 UTC m=+0.129987090 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.955 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.955 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.965 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:09:45 compute-0 nova_compute[183177]: 2026-01-26 20:09:45.965 183181 INFO nova.compute.claims [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:09:47 compute-0 nova_compute[183177]: 2026-01-26 20:09:47.119 183181 DEBUG nova.compute.provider_tree [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:09:47 compute-0 nova_compute[183177]: 2026-01-26 20:09:47.626 183181 DEBUG nova.scheduler.client.report [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.142 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.186s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.143 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.661 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.661 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.662 183181 WARNING neutronclient.v2_0.client [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.663 183181 WARNING neutronclient.v2_0.client [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:09:48 compute-0 nova_compute[183177]: 2026-01-26 20:09:48.910 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:49 compute-0 nova_compute[183177]: 2026-01-26 20:09:49.172 183181 INFO nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:09:49 compute-0 nova_compute[183177]: 2026-01-26 20:09:49.322 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Successfully created port: 485296a7-5a9b-4358-a4ae-28d57a2e471f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:09:49 compute-0 nova_compute[183177]: 2026-01-26 20:09:49.681 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.114 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Successfully updated port: 485296a7-5a9b-4358-a4ae-28d57a2e471f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.146 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.427 183181 DEBUG nova.compute.manager [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-changed-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.427 183181 DEBUG nova.compute.manager [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Refreshing instance network info cache due to event network-changed-485296a7-5a9b-4358-a4ae-28d57a2e471f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.427 183181 DEBUG oslo_concurrency.lockutils [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.427 183181 DEBUG oslo_concurrency.lockutils [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.428 183181 DEBUG nova.network.neutron [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Refreshing network info cache for port 485296a7-5a9b-4358-a4ae-28d57a2e471f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.619 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.707 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.709 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.710 183181 INFO nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Creating image(s)
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.712 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.713 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.715 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.716 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.724 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.727 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.814 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.815 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.816 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.817 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.824 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.825 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.879 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.881 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.929 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.930 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.931 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.942 183181 WARNING neutronclient.v2_0.client [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.983 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.984 183181 DEBUG nova.virt.disk.api [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Checking if we can resize image /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:09:50 compute-0 nova_compute[183177]: 2026-01-26 20:09:50.985 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.073 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.074 183181 DEBUG nova.virt.disk.api [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Cannot resize image /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.075 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.076 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Ensure instance console log exists: /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.076 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.077 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.077 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:51 compute-0 podman[215968]: 2026-01-26 20:09:51.301173169 +0000 UTC m=+0.054401546 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.451 183181 DEBUG nova.network.neutron [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:09:51 compute-0 nova_compute[183177]: 2026-01-26 20:09:51.645 183181 DEBUG nova.network.neutron [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:09:52 compute-0 nova_compute[183177]: 2026-01-26 20:09:52.161 183181 DEBUG oslo_concurrency.lockutils [req-9adcd097-f81f-4d79-8151-928a48ac6984 req-1a1057c4-a7ca-43d1-ace8-6cb22cfbc5f5 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:09:52 compute-0 nova_compute[183177]: 2026-01-26 20:09:52.162 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquired lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:09:52 compute-0 nova_compute[183177]: 2026-01-26 20:09:52.162 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:09:53 compute-0 nova_compute[183177]: 2026-01-26 20:09:53.473 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:09:53 compute-0 nova_compute[183177]: 2026-01-26 20:09:53.808 183181 WARNING neutronclient.v2_0.client [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:09:53 compute-0 nova_compute[183177]: 2026-01-26 20:09:53.913 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.020 183181 DEBUG nova.network.neutron [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Updating instance_info_cache with network_info: [{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.528 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Releasing lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.529 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance network_info: |[{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.533 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Start _get_guest_xml network_info=[{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.539 183181 WARNING nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.541 183181 DEBUG nova.virt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1031211870', uuid='b427e269-a2ae-4c99-b118-5a532d52b29d'), owner=OwnerMeta(userid='7a3a0c805ad14e438b8e8a90e16d8d02', username='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin', projectid='393f035f3d824babb9d76f6e83e4192b', projectname='tempest-TestExecuteZoneMigrationStrategy-1136104294'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": 
"485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769458194.5414011) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.547 183181 DEBUG nova.virt.libvirt.host [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.548 183181 DEBUG nova.virt.libvirt.host [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.551 183181 DEBUG nova.virt.libvirt.host [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.552 183181 DEBUG nova.virt.libvirt.host [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.554 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.554 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.555 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.555 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.555 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.556 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.556 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.556 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.557 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.557 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.557 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.557 183181 DEBUG nova.virt.hardware [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.564 183181 DEBUG nova.virt.libvirt.vif [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1031211870',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1031211870',id=30,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-otl3sowe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExe
cuteZoneMigrationStrategy-1136104294-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:09:49Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=b427e269-a2ae-4c99-b118-5a532d52b29d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.564 183181 DEBUG nova.network.os_vif_util [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converting VIF {"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.566 183181 DEBUG nova.network.os_vif_util [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:09:54 compute-0 nova_compute[183177]: 2026-01-26 20:09:54.567 183181 DEBUG nova.objects.instance [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lazy-loading 'pci_devices' on Instance uuid b427e269-a2ae-4c99-b118-5a532d52b29d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.084 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <uuid>b427e269-a2ae-4c99-b118-5a532d52b29d</uuid>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <name>instance-0000001e</name>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1031211870</nova:name>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:09:54</nova:creationTime>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:09:55 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:09:55 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         <nova:port uuid="485296a7-5a9b-4358-a4ae-28d57a2e471f">
Jan 26 20:09:55 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <system>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="serial">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="uuid">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </system>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <os>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </os>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <features>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </features>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:e4:a4:a0"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <target dev="tap485296a7-5a"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <video>
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </video>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:09:55 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:09:55 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:09:55 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:09:55 compute-0 nova_compute[183177]: </domain>
Jan 26 20:09:55 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.086 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Preparing to wait for external event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.087 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.087 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.088 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.089 183181 DEBUG nova.virt.libvirt.vif [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1031211870',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1031211870',id=30,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-otl3sowe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:09:49Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=b427e269-a2ae-4c99-b118-5a532d52b29d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.090 183181 DEBUG nova.network.os_vif_util [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converting VIF {"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.091 183181 DEBUG nova.network.os_vif_util [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.092 183181 DEBUG os_vif [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.093 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.094 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.095 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.096 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.097 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c4b1f8b7-74e6-5ed4-9112-d8565156d794', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.099 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.104 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.104 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap485296a7-5a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.105 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap485296a7-5a, col_values=(('qos', UUID('e31d578b-653d-4752-90bd-fcb4d7172e50')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.106 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap485296a7-5a, col_values=(('external_ids', {'iface-id': '485296a7-5a9b-4358-a4ae-28d57a2e471f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:a4:a0', 'vm-uuid': 'b427e269-a2ae-4c99-b118-5a532d52b29d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:55 compute-0 NetworkManager[55489]: <info>  [1769458195.1097] manager: (tap485296a7-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.110 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.113 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.116 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:55 compute-0 nova_compute[183177]: 2026-01-26 20:09:55.117 183181 INFO os_vif [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a')
Jan 26 20:09:56 compute-0 nova_compute[183177]: 2026-01-26 20:09:56.661 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:09:56 compute-0 nova_compute[183177]: 2026-01-26 20:09:56.661 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:09:56 compute-0 nova_compute[183177]: 2026-01-26 20:09:56.662 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No VIF found with MAC fa:16:3e:e4:a4:a0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:09:56 compute-0 nova_compute[183177]: 2026-01-26 20:09:56.663 183181 INFO nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Using config drive
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.176 183181 WARNING neutronclient.v2_0.client [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.344 183181 INFO nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Creating config drive at /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.348 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvpvr5m8b execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.483 183181 DEBUG oslo_concurrency.processutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpvpvr5m8b" returned: 0 in 0.135s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:09:57 compute-0 kernel: tap485296a7-5a: entered promiscuous mode
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.5921] manager: (tap485296a7-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 26 20:09:57 compute-0 ovn_controller[95396]: 2026-01-26T20:09:57Z|00232|binding|INFO|Claiming lport 485296a7-5a9b-4358-a4ae-28d57a2e471f for this chassis.
Jan 26 20:09:57 compute-0 ovn_controller[95396]: 2026-01-26T20:09:57Z|00233|binding|INFO|485296a7-5a9b-4358-a4ae-28d57a2e471f: Claiming fa:16:3e:e4:a4:a0 10.100.0.3
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.592 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.601 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 systemd-machined[154465]: New machine qemu-23-instance-0000001e.
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.620 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:a4:a0 10.100.0.3'], port_security=['fa:16:3e:e4:a4:a0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b427e269-a2ae-4c99-b118-5a532d52b29d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '00852c1f-7776-4984-b01a-e3f5f5d3f1d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03a6e965-0f13-4adb-a0dd-8b518d1d2445, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=485296a7-5a9b-4358-a4ae-28d57a2e471f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.620 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 485296a7-5a9b-4358-a4ae-28d57a2e471f in datapath bbde741b-e853-4ad9-b0df-87ed33f347f8 bound to our chassis
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.622 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.635 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[51044164-cb0d-40ba-9c92-a8f09dfe6a3d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.636 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbbde741b-e1 in ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.638 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbbde741b-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.638 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e2865340-9bcb-46db-9508-6e018a651cb7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.638 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f174d83f-7fb5-4fe3-a34d-27199daa510f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.649 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[c462f155-1572-4c63-86dd-d171a6d4a71d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001e.
Jan 26 20:09:57 compute-0 systemd-udevd[216015]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.673 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6d311ae6-e5ae-4de7-9619-265ac65c82ae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_controller[95396]: 2026-01-26T20:09:57Z|00234|binding|INFO|Setting lport 485296a7-5a9b-4358-a4ae-28d57a2e471f ovn-installed in OVS
Jan 26 20:09:57 compute-0 ovn_controller[95396]: 2026-01-26T20:09:57Z|00235|binding|INFO|Setting lport 485296a7-5a9b-4358-a4ae-28d57a2e471f up in Southbound
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.675 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.6862] device (tap485296a7-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.6873] device (tap485296a7-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.710 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4b39ad-8bbb-4a1c-b59b-97c8b93871b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.716 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fecc61d4-855e-4089-9f01-f2efd5570b11]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 systemd-udevd[216022]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.7186] manager: (tapbbde741b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.749 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[ea7d4780-b789-4d53-9b1b-bb5155573f26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.752 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[79757d78-2dac-42b6-9af8-390ee91d7ddc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.7735] device (tapbbde741b-e0): carrier: link connected
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.780 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[b1cc5eac-05f9-46e3-8122-b36800aa1d00]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.803 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7089e2d9-1c0a-42ee-9866-8c884bd0bf5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbde741b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:8a:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565666, 'reachable_time': 27007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216045, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.818 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6dbc3e6a-2133-465c-998a-87389562ef8c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:8afc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565666, 'tstamp': 565666}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216046, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.839 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6823205d-1ae5-462e-b2fc-7aecb10097a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbde741b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:8a:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565666, 'reachable_time': 27007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216047, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.875 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[51e6bde9-dbbb-472a-b9db-a8d9c33cbcb4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.941 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a97e2da3-bf69-41a7-9f39-62ba456da94f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.942 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbde741b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.942 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.942 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbde741b-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:57 compute-0 NetworkManager[55489]: <info>  [1769458197.9444] manager: (tapbbde741b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.944 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 kernel: tapbbde741b-e0: entered promiscuous mode
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.946 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.947 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbbde741b-e0, col_values=(('external_ids', {'iface-id': 'f7d8ca99-79c7-4c8b-b8a1-edbf275471a0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:09:57 compute-0 ovn_controller[95396]: 2026-01-26T20:09:57Z|00236|binding|INFO|Releasing lport f7d8ca99-79c7-4c8b-b8a1-edbf275471a0 from this chassis (sb_readonly=0)
Jan 26 20:09:57 compute-0 nova_compute[183177]: 2026-01-26 20:09:57.970 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.971 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[686c9037-1e53-481b-95f7-7e1aa0d10bfd]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.972 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.972 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.972 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for bbde741b-e853-4ad9-b0df-87ed33f347f8 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.972 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.972 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e97088f8-e0a4-4bbf-bbb9-06d4ca12007e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.973 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.973 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9f81ed6e-2d0a-4fda-8a2d-9e9e96902bef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.973 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:09:57 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:09:57.974 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'env', 'PROCESS_TAG=haproxy-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bbde741b-e853-4ad9-b0df-87ed33f347f8.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:09:58 compute-0 podman[216079]: 2026-01-26 20:09:58.421379428 +0000 UTC m=+0.060957632 container create 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Jan 26 20:09:58 compute-0 systemd[1]: Started libpod-conmon-35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f.scope.
Jan 26 20:09:58 compute-0 podman[216079]: 2026-01-26 20:09:58.387332852 +0000 UTC m=+0.026911126 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:09:58 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:09:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb82d917dda3d8f0c19cb564dc4407919421071fe5c5cb22a1c518c404de57eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:09:58 compute-0 podman[216079]: 2026-01-26 20:09:58.539124907 +0000 UTC m=+0.178703121 container init 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 26 20:09:58 compute-0 podman[216079]: 2026-01-26 20:09:58.54591467 +0000 UTC m=+0.185492864 container start 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 26 20:09:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [NOTICE]   (216098) : New worker (216100) forked
Jan 26 20:09:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [NOTICE]   (216098) : Loading success.
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.679 183181 DEBUG nova.compute.manager [req-53a6e413-23d1-49fe-b72d-90f4c45b7304 req-61e48f82-8f15-4ba9-9ec5-4a69fc062e22 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.681 183181 DEBUG oslo_concurrency.lockutils [req-53a6e413-23d1-49fe-b72d-90f4c45b7304 req-61e48f82-8f15-4ba9-9ec5-4a69fc062e22 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.682 183181 DEBUG oslo_concurrency.lockutils [req-53a6e413-23d1-49fe-b72d-90f4c45b7304 req-61e48f82-8f15-4ba9-9ec5-4a69fc062e22 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.683 183181 DEBUG oslo_concurrency.lockutils [req-53a6e413-23d1-49fe-b72d-90f4c45b7304 req-61e48f82-8f15-4ba9-9ec5-4a69fc062e22 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.683 183181 DEBUG nova.compute.manager [req-53a6e413-23d1-49fe-b72d-90f4c45b7304 req-61e48f82-8f15-4ba9-9ec5-4a69fc062e22 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Processing event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.729 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.735 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.740 183181 INFO nova.virt.libvirt.driver [-] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance spawned successfully.
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.740 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:09:58 compute-0 nova_compute[183177]: 2026-01-26 20:09:58.915 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.253 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.254 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.254 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.255 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.255 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.256 183181 DEBUG nova.virt.libvirt.driver [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:09:59 compute-0 podman[192499]: time="2026-01-26T20:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:09:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:09:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.766 183181 INFO nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Took 9.06 seconds to spawn the instance on the hypervisor.
Jan 26 20:09:59 compute-0 nova_compute[183177]: 2026-01-26 20:09:59.767 183181 DEBUG nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.109 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.306 183181 INFO nova.compute.manager [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Took 14.40 seconds to build instance.
Jan 26 20:10:00 compute-0 sshd-session[216116]: Invalid user oracle from 193.32.162.151 port 37528
Jan 26 20:10:00 compute-0 sshd-session[216116]: Connection closed by invalid user oracle 193.32.162.151 port 37528 [preauth]
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 DEBUG nova.compute.manager [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 DEBUG oslo_concurrency.lockutils [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 DEBUG oslo_concurrency.lockutils [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 DEBUG oslo_concurrency.lockutils [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 DEBUG nova.compute.manager [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No waiting events found dispatching network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.739 183181 WARNING nova.compute.manager [req-d7552371-00fe-42ea-a3c7-9e4ef862bff2 req-31156eac-7a9e-4bb5-99df-a29c4115ce41 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received unexpected event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with vm_state active and task_state None.
Jan 26 20:10:00 compute-0 nova_compute[183177]: 2026-01-26 20:10:00.811 183181 DEBUG oslo_concurrency.lockutils [None req-97c0668b-b7aa-4443-8011-c9f13ca88b8d 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.929s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:01 compute-0 openstack_network_exporter[195363]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:10:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:10:01 compute-0 openstack_network_exporter[195363]: ERROR   20:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:10:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:10:03 compute-0 nova_compute[183177]: 2026-01-26 20:10:03.950 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:04 compute-0 nova_compute[183177]: 2026-01-26 20:10:04.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:05 compute-0 nova_compute[183177]: 2026-01-26 20:10:05.138 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:08 compute-0 nova_compute[183177]: 2026-01-26 20:10:08.954 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:10 compute-0 nova_compute[183177]: 2026-01-26 20:10:10.141 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:10 compute-0 ovn_controller[95396]: 2026-01-26T20:10:10Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:a4:a0 10.100.0.3
Jan 26 20:10:10 compute-0 ovn_controller[95396]: 2026-01-26T20:10:10Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:a4:a0 10.100.0.3
Jan 26 20:10:11 compute-0 nova_compute[183177]: 2026-01-26 20:10:11.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:12.161 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:10:12 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:12.161 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:10:12 compute-0 nova_compute[183177]: 2026-01-26 20:10:12.203 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:13 compute-0 nova_compute[183177]: 2026-01-26 20:10:13.992 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:15 compute-0 nova_compute[183177]: 2026-01-26 20:10:15.184 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:15 compute-0 sshd-session[216140]: Connection closed by authenticating user root 188.166.116.149 port 56090 [preauth]
Jan 26 20:10:16 compute-0 podman[216144]: 2026-01-26 20:10:16.340290853 +0000 UTC m=+0.067476587 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:10:16 compute-0 podman[216143]: 2026-01-26 20:10:16.347476816 +0000 UTC m=+0.086194721 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Jan 26 20:10:16 compute-0 podman[216142]: 2026-01-26 20:10:16.426608936 +0000 UTC m=+0.170082819 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:10:17 compute-0 nova_compute[183177]: 2026-01-26 20:10:17.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:17 compute-0 nova_compute[183177]: 2026-01-26 20:10:17.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:17 compute-0 nova_compute[183177]: 2026-01-26 20:10:17.152 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:18 compute-0 sshd-session[216202]: Connection closed by authenticating user root 142.93.140.142 port 37430 [preauth]
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.735 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.735 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.736 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.736 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:10:18 compute-0 nova_compute[183177]: 2026-01-26 20:10:18.994 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:19 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:19.163 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:10:19 compute-0 nova_compute[183177]: 2026-01-26 20:10:19.793 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:10:19 compute-0 nova_compute[183177]: 2026-01-26 20:10:19.849 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:10:19 compute-0 nova_compute[183177]: 2026-01-26 20:10:19.850 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:10:19 compute-0 nova_compute[183177]: 2026-01-26 20:10:19.915 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.083 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.084 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.103 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.103 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5563MB free_disk=73.06147384643555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.104 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.104 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:20 compute-0 nova_compute[183177]: 2026-01-26 20:10:20.187 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:21 compute-0 nova_compute[183177]: 2026-01-26 20:10:21.230 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance b427e269-a2ae-4c99-b118-5a532d52b29d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:10:21 compute-0 nova_compute[183177]: 2026-01-26 20:10:21.231 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:10:21 compute-0 nova_compute[183177]: 2026-01-26 20:10:21.231 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:10:20 up  1:34,  0 user,  load average: 0.19, 0.25, 0.26\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_393f035f3d824babb9d76f6e83e4192b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:10:21 compute-0 nova_compute[183177]: 2026-01-26 20:10:21.292 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:10:21 compute-0 nova_compute[183177]: 2026-01-26 20:10:21.801 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:10:22 compute-0 nova_compute[183177]: 2026-01-26 20:10:22.313 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:10:22 compute-0 nova_compute[183177]: 2026-01-26 20:10:22.313 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.209s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:22 compute-0 podman[216212]: 2026-01-26 20:10:22.33594936 +0000 UTC m=+0.076129300 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:10:23 compute-0 nova_compute[183177]: 2026-01-26 20:10:23.997 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:24.108 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:24.109 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:24.109 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:24 compute-0 nova_compute[183177]: 2026-01-26 20:10:24.313 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:25 compute-0 nova_compute[183177]: 2026-01-26 20:10:25.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:25 compute-0 nova_compute[183177]: 2026-01-26 20:10:25.233 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:29 compute-0 nova_compute[183177]: 2026-01-26 20:10:29.001 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:29 compute-0 nova_compute[183177]: 2026-01-26 20:10:29.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:10:29 compute-0 podman[192499]: time="2026-01-26T20:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:10:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:10:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Jan 26 20:10:30 compute-0 nova_compute[183177]: 2026-01-26 20:10:30.235 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:31 compute-0 openstack_network_exporter[195363]: ERROR   20:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:10:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:10:31 compute-0 openstack_network_exporter[195363]: ERROR   20:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:10:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:10:34 compute-0 nova_compute[183177]: 2026-01-26 20:10:34.035 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:35 compute-0 nova_compute[183177]: 2026-01-26 20:10:35.287 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:39 compute-0 nova_compute[183177]: 2026-01-26 20:10:39.038 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:40 compute-0 nova_compute[183177]: 2026-01-26 20:10:40.288 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:43 compute-0 nova_compute[183177]: 2026-01-26 20:10:43.220 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Check if temp file /var/lib/nova/instances/tmp91qngkol exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:10:43 compute-0 nova_compute[183177]: 2026-01-26 20:10:43.225 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp91qngkol',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b427e269-a2ae-4c99-b118-5a532d52b29d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:10:44 compute-0 nova_compute[183177]: 2026-01-26 20:10:44.080 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:45 compute-0 nova_compute[183177]: 2026-01-26 20:10:45.335 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:47 compute-0 podman[216239]: 2026-01-26 20:10:47.35668124 +0000 UTC m=+0.084196217 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 26 20:10:47 compute-0 podman[216238]: 2026-01-26 20:10:47.362847187 +0000 UTC m=+0.099384866 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 20:10:47 compute-0 podman[216237]: 2026-01-26 20:10:47.398946628 +0000 UTC m=+0.138182000 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.804 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.890 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.891 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.979 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.981 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Preparing to wait for external event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.982 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.983 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:47 compute-0 nova_compute[183177]: 2026-01-26 20:10:47.983 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:49 compute-0 nova_compute[183177]: 2026-01-26 20:10:49.083 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:50 compute-0 nova_compute[183177]: 2026-01-26 20:10:50.370 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:51 compute-0 ovn_controller[95396]: 2026-01-26T20:10:51Z|00237|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 20:10:53 compute-0 sshd-session[216310]: Connection closed by authenticating user root 188.166.116.149 port 40454 [preauth]
Jan 26 20:10:53 compute-0 podman[216313]: 2026-01-26 20:10:53.338116308 +0000 UTC m=+0.082428010 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:10:53 compute-0 sshd-session[216312]: Connection closed by authenticating user root 142.93.140.142 port 55358 [preauth]
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.067 183181 DEBUG nova.compute.manager [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.068 183181 DEBUG oslo_concurrency.lockutils [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.068 183181 DEBUG oslo_concurrency.lockutils [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.068 183181 DEBUG oslo_concurrency.lockutils [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.069 183181 DEBUG nova.compute.manager [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No event matching network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f in dict_keys([('network-vif-plugged', '485296a7-5a9b-4358-a4ae-28d57a2e471f')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.069 183181 DEBUG nova.compute.manager [req-f666cf2d-af6f-4056-bb01-55151e7f3127 req-8ce5d43a-f6b6-4772-819c-7eae23ce4fdd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:10:54 compute-0 nova_compute[183177]: 2026-01-26 20:10:54.084 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:55 compute-0 nova_compute[183177]: 2026-01-26 20:10:55.400 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:55 compute-0 nova_compute[183177]: 2026-01-26 20:10:55.528 183181 INFO nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Took 7.54 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.156 183181 DEBUG nova.compute.manager [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.156 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.157 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.157 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.157 183181 DEBUG nova.compute.manager [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Processing event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.157 183181 DEBUG nova.compute.manager [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-changed-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.158 183181 DEBUG nova.compute.manager [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Refreshing instance network info cache due to event network-changed-485296a7-5a9b-4358-a4ae-28d57a2e471f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.158 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.158 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.158 183181 DEBUG nova.network.neutron [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Refreshing network info cache for port 485296a7-5a9b-4358-a4ae-28d57a2e471f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.160 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.666 183181 WARNING neutronclient.v2_0.client [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:10:56 compute-0 nova_compute[183177]: 2026-01-26 20:10:56.674 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp91qngkol',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b427e269-a2ae-4c99-b118-5a532d52b29d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(b528302d-1309-47bf-aec2-f50419dd72bf),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.182 183181 WARNING neutronclient.v2_0.client [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.203 183181 DEBUG nova.objects.instance [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid b427e269-a2ae-4c99-b118-5a532d52b29d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.203 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.204 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.205 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.506 183181 DEBUG nova.network.neutron [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Updated VIF entry in instance network info cache for port 485296a7-5a9b-4358-a4ae-28d57a2e471f. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.507 183181 DEBUG nova.network.neutron [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Updating instance_info_cache with network_info: [{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.707 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.707 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.718 183181 DEBUG nova.virt.libvirt.vif [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1031211870',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1031211870',id=30,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:09:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-otl3sowe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:09:59Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=b427e269-a2ae-4c99-b118-5a532d52b29d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.719 183181 DEBUG nova.network.os_vif_util [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.722 183181 DEBUG nova.network.os_vif_util [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.723 183181 DEBUG nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:e4:a4:a0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <target dev="tap485296a7-5a"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]: </interface>
Jan 26 20:10:57 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.725 183181 DEBUG nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <name>instance-0000001e</name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <uuid>b427e269-a2ae-4c99-b118-5a532d52b29d</uuid>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1031211870</nova:name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:09:54</nova:creationTime>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:port uuid="485296a7-5a9b-4358-a4ae-28d57a2e471f">
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="serial">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="uuid">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:e4:a4:a0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap485296a7-5a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </target>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </console>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </input>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]: </domain>
Jan 26 20:10:57 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.726 183181 DEBUG nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <name>instance-0000001e</name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <uuid>b427e269-a2ae-4c99-b118-5a532d52b29d</uuid>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1031211870</nova:name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:09:54</nova:creationTime>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:port uuid="485296a7-5a9b-4358-a4ae-28d57a2e471f">
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="serial">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="uuid">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:e4:a4:a0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap485296a7-5a"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </target>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </console>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </input>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]: </domain>
Jan 26 20:10:57 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.727 183181 DEBUG nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <name>instance-0000001e</name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <uuid>b427e269-a2ae-4c99-b118-5a532d52b29d</uuid>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1031211870</nova:name>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:09:54</nova:creationTime>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <nova:port uuid="485296a7-5a9b-4358-a4ae-28d57a2e471f">
Jan 26 20:10:57 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="serial">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="uuid">b427e269-a2ae-4c99-b118-5a532d52b29d</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </system>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </os>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </features>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/disk.config"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:e4:a4:a0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target dev="tap485296a7-5a"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:10:57 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       </target>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d/console.log" append="off"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </console>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </input>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </video>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:10:57 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:10:57 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:10:57 compute-0 nova_compute[183177]: </domain>
Jan 26 20:10:57 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:10:57 compute-0 nova_compute[183177]: 2026-01-26 20:10:57.728 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:10:58 compute-0 nova_compute[183177]: 2026-01-26 20:10:58.014 183181 DEBUG oslo_concurrency.lockutils [req-4f35eab0-d3e3-44b2-a93b-d4e2b9a8ad35 req-03df3fc0-9500-4c8b-85b8-24e1d5f9dbfd 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-b427e269-a2ae-4c99-b118-5a532d52b29d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:10:58 compute-0 nova_compute[183177]: 2026-01-26 20:10:58.210 183181 DEBUG nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:10:58 compute-0 nova_compute[183177]: 2026-01-26 20:10:58.211 183181 INFO nova.virt.libvirt.migration [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.086 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.235 183181 INFO nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:10:59 compute-0 kernel: tap485296a7-5a (unregistering): left promiscuous mode
Jan 26 20:10:59 compute-0 NetworkManager[55489]: <info>  [1769458259.4665] device (tap485296a7-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:10:59 compute-0 ovn_controller[95396]: 2026-01-26T20:10:59Z|00238|binding|INFO|Releasing lport 485296a7-5a9b-4358-a4ae-28d57a2e471f from this chassis (sb_readonly=0)
Jan 26 20:10:59 compute-0 ovn_controller[95396]: 2026-01-26T20:10:59Z|00239|binding|INFO|Setting lport 485296a7-5a9b-4358-a4ae-28d57a2e471f down in Southbound
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.471 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 ovn_controller[95396]: 2026-01-26T20:10:59Z|00240|binding|INFO|Removing iface tap485296a7-5a ovn-installed in OVS
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.474 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.502 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:a4:a0 10.100.0.3'], port_security=['fa:16:3e:e4:a4:a0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b427e269-a2ae-4c99-b118-5a532d52b29d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '00852c1f-7776-4984-b01a-e3f5f5d3f1d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03a6e965-0f13-4adb-a0dd-8b518d1d2445, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=485296a7-5a9b-4358-a4ae-28d57a2e471f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.503 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 485296a7-5a9b-4358-a4ae-28d57a2e471f in datapath bbde741b-e853-4ad9-b0df-87ed33f347f8 unbound from our chassis
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.505 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bbde741b-e853-4ad9-b0df-87ed33f347f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.507 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ccec1103-e398-469b-8c83-8c93c49d050c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.508 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 namespace which is not needed anymore
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.509 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 26 20:10:59 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Consumed 15.687s CPU time.
Jan 26 20:10:59 compute-0 systemd-machined[154465]: Machine qemu-23-instance-0000001e terminated.
Jan 26 20:10:59 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [NOTICE]   (216098) : haproxy version is 3.0.5-8e879a5
Jan 26 20:10:59 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [NOTICE]   (216098) : path to executable is /usr/sbin/haproxy
Jan 26 20:10:59 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [WARNING]  (216098) : Exiting Master process...
Jan 26 20:10:59 compute-0 podman[216376]: 2026-01-26 20:10:59.6388729 +0000 UTC m=+0.040145891 container kill 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 20:10:59 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [ALERT]    (216098) : Current worker (216100) exited with code 143 (Terminated)
Jan 26 20:10:59 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216094]: [WARNING]  (216098) : All workers exited. Exiting... (0)
Jan 26 20:10:59 compute-0 systemd[1]: libpod-35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f.scope: Deactivated successfully.
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.670 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.677 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 podman[216392]: 2026-01-26 20:10:59.724270299 +0000 UTC m=+0.048871447 container died 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120)
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.726 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.726 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.727 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.737 183181 DEBUG nova.virt.libvirt.guest [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'b427e269-a2ae-4c99-b118-5a532d52b29d' (instance-0000001e) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.738 183181 INFO nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migration operation has completed
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.738 183181 INFO nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] _post_live_migration() is started..
Jan 26 20:10:59 compute-0 podman[192499]: time="2026-01-26T20:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.761 183181 WARNING neutronclient.v2_0.client [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.761 183181 WARNING neutronclient.v2_0.client [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:10:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f-userdata-shm.mount: Deactivated successfully.
Jan 26 20:10:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb82d917dda3d8f0c19cb564dc4407919421071fe5c5cb22a1c518c404de57eb-merged.mount: Deactivated successfully.
Jan 26 20:10:59 compute-0 podman[216392]: 2026-01-26 20:10:59.774902441 +0000 UTC m=+0.099503549 container cleanup 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 20:10:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16566 "" "Go-http-client/1.1"
Jan 26 20:10:59 compute-0 systemd[1]: libpod-conmon-35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f.scope: Deactivated successfully.
Jan 26 20:10:59 compute-0 podman[216395]: 2026-01-26 20:10:59.79898082 +0000 UTC m=+0.115428148 container remove 35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 26 20:10:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.807 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4b921707-d703-419a-bb89-8a8f2408ff79]: (4, ("Mon Jan 26 08:10:59 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 (35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f)\n35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f\nMon Jan 26 08:10:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 (35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f)\n35cd1998b3889cb2a4b2c649ebe88da39761a2ff283e277cd2350d8c4a54a72f\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.809 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[185c426d-dfa4-46f6-bd49-3d023fce4e80]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.810 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.810 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[dc09c5e3-73ff-4f38-9fa7-9fd973abe280]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.811 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbde741b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.813 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 kernel: tapbbde741b-e0: left promiscuous mode
Jan 26 20:10:59 compute-0 nova_compute[183177]: 2026-01-26 20:10:59.844 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.847 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[81536c8f-e9f3-47a9-9bd5-743a19239cad]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.863 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[97b9a01c-665b-44ce-ad5a-afbebce92eca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.865 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[31a6135b-c83b-4313-b541-4d607f84fec0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.883 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2197fe-07c3-4cc2-89ab-67375877c486]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565658, 'reachable_time': 15493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216435, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.886 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:10:59 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:10:59.886 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[9588decc-c37a-420c-825b-2437e7209537]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:10:59 compute-0 systemd[1]: run-netns-ovnmeta\x2dbbde741b\x2de853\x2d4ad9\x2db0df\x2d87ed33f347f8.mount: Deactivated successfully.
Jan 26 20:11:00 compute-0 nova_compute[183177]: 2026-01-26 20:11:00.431 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:01 compute-0 openstack_network_exporter[195363]: ERROR   20:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:11:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:11:01 compute-0 openstack_network_exporter[195363]: ERROR   20:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:11:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.665 183181 DEBUG nova.compute.manager [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.666 183181 DEBUG oslo_concurrency.lockutils [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.667 183181 DEBUG oslo_concurrency.lockutils [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.667 183181 DEBUG oslo_concurrency.lockutils [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.668 183181 DEBUG nova.compute.manager [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No waiting events found dispatching network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:11:01 compute-0 nova_compute[183177]: 2026-01-26 20:11:01.668 183181 DEBUG nova.compute.manager [req-6a7cecf1-48a7-4e09-97eb-9e1fa04b8642 req-2b7239d8-fe67-4ac2-9e7d-487d404a535e 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.287 183181 DEBUG nova.network.neutron [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port 485296a7-5a9b-4358-a4ae-28d57a2e471f and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.288 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.289 183181 DEBUG nova.virt.libvirt.vif [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:09:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1031211870',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1031211870',id=30,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:09:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-otl3sowe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:10:37Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=b427e269-a2ae-4c99-b118-5a532d52b29d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.289 183181 DEBUG nova.network.os_vif_util [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "address": "fa:16:3e:e4:a4:a0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap485296a7-5a", "ovs_interfaceid": "485296a7-5a9b-4358-a4ae-28d57a2e471f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.290 183181 DEBUG nova.network.os_vif_util [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.290 183181 DEBUG os_vif [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.292 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.292 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap485296a7-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.294 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.296 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.296 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.297 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.298 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e31d578b-653d-4752-90bd-fcb4d7172e50) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.298 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.300 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.302 183181 INFO os_vif [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:a4:a0,bridge_name='br-int',has_traffic_filtering=True,id=485296a7-5a9b-4358-a4ae-28d57a2e471f,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap485296a7-5a')
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.302 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.303 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.303 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.303 183181 DEBUG nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.304 183181 INFO nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Deleting instance files /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d_del
Jan 26 20:11:02 compute-0 nova_compute[183177]: 2026-01-26 20:11:02.305 183181 INFO nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Deletion of /var/lib/nova/instances/b427e269-a2ae-4c99-b118-5a532d52b29d_del complete
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.725 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.726 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.726 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.727 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.727 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No waiting events found dispatching network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.727 183181 WARNING nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received unexpected event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with vm_state active and task_state migrating.
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.728 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.728 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.729 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.729 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.729 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No waiting events found dispatching network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.730 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-unplugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.730 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.730 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.731 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.731 183181 DEBUG oslo_concurrency.lockutils [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.732 183181 DEBUG nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] No waiting events found dispatching network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:11:03 compute-0 nova_compute[183177]: 2026-01-26 20:11:03.732 183181 WARNING nova.compute.manager [req-4581b84b-a07f-41f7-aff8-515541bf9201 req-a85849f8-eabe-4e06-98dc-fc4db4e06b27 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Received unexpected event network-vif-plugged-485296a7-5a9b-4358-a4ae-28d57a2e471f for instance with vm_state active and task_state migrating.
Jan 26 20:11:04 compute-0 nova_compute[183177]: 2026-01-26 20:11:04.088 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:04 compute-0 nova_compute[183177]: 2026-01-26 20:11:04.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:07 compute-0 nova_compute[183177]: 2026-01-26 20:11:07.299 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:09 compute-0 nova_compute[183177]: 2026-01-26 20:11:09.090 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:12 compute-0 nova_compute[183177]: 2026-01-26 20:11:12.301 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:13 compute-0 nova_compute[183177]: 2026-01-26 20:11:13.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:14 compute-0 nova_compute[183177]: 2026-01-26 20:11:14.131 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:14 compute-0 nova_compute[183177]: 2026-01-26 20:11:14.849 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:14 compute-0 nova_compute[183177]: 2026-01-26 20:11:14.850 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:14 compute-0 nova_compute[183177]: 2026-01-26 20:11:14.850 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "b427e269-a2ae-4c99-b118-5a532d52b29d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.363 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.364 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.364 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.364 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.544 183181 WARNING nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.546 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.588 183181 DEBUG oslo_concurrency.processutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.589 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5727MB free_disk=73.09014129638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.589 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:15 compute-0 nova_compute[183177]: 2026-01-26 20:11:15.589 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:16 compute-0 nova_compute[183177]: 2026-01-26 20:11:16.614 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance b427e269-a2ae-4c99-b118-5a532d52b29d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.152 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.159 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.160 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.194 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration b528302d-1309-47bf-aec2-f50419dd72bf is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.194 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.195 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:11:15 up  1:35,  0 user,  load average: 0.32, 0.28, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.235 183181 DEBUG nova.compute.provider_tree [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.304 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:17 compute-0 nova_compute[183177]: 2026-01-26 20:11:17.745 183181 DEBUG nova.scheduler.client.report [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.259 183181 DEBUG nova.compute.resource_tracker [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.260 183181 DEBUG oslo_concurrency.lockutils [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.671s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.298 183181 INFO nova.compute.manager [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:11:18 compute-0 podman[216440]: 2026-01-26 20:11:18.33887633 +0000 UTC m=+0.080105607 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 26 20:11:18 compute-0 podman[216439]: 2026-01-26 20:11:18.347417159 +0000 UTC m=+0.084442013 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public)
Jan 26 20:11:18 compute-0 podman[216438]: 2026-01-26 20:11:18.379265307 +0000 UTC m=+0.120458823 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.677 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.678 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.678 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.678 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.891 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.893 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.925 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.926 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.09014129638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.926 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.926 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:18.952 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:11:18 compute-0 nova_compute[183177]: 2026-01-26 20:11:18.953 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:18 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:18.954 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:11:19 compute-0 nova_compute[183177]: 2026-01-26 20:11:19.166 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:19 compute-0 nova_compute[183177]: 2026-01-26 20:11:19.407 183181 INFO nova.scheduler.client.report [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration b528302d-1309-47bf-aec2-f50419dd72bf
Jan 26 20:11:19 compute-0 nova_compute[183177]: 2026-01-26 20:11:19.408 183181 DEBUG nova.virt.libvirt.driver [None req-9a2d1ee9-a1b8-4be8-a0ce-2ab4b794bfe1 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: b427e269-a2ae-4c99-b118-5a532d52b29d] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:11:19 compute-0 nova_compute[183177]: 2026-01-26 20:11:19.974 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:11:19 compute-0 nova_compute[183177]: 2026-01-26 20:11:19.974 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:11:18 up  1:35,  0 user,  load average: 0.29, 0.28, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:11:20 compute-0 nova_compute[183177]: 2026-01-26 20:11:20.001 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:11:20 compute-0 nova_compute[183177]: 2026-01-26 20:11:20.512 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:11:21 compute-0 nova_compute[183177]: 2026-01-26 20:11:21.033 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:11:21 compute-0 nova_compute[183177]: 2026-01-26 20:11:21.033 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:22 compute-0 nova_compute[183177]: 2026-01-26 20:11:22.307 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:23 compute-0 nova_compute[183177]: 2026-01-26 20:11:23.034 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:24.110 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:24.110 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:24.110 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:24 compute-0 nova_compute[183177]: 2026-01-26 20:11:24.167 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:24 compute-0 podman[216505]: 2026-01-26 20:11:24.313132043 +0000 UTC m=+0.065886504 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:11:26 compute-0 nova_compute[183177]: 2026-01-26 20:11:26.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:11:26 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:11:26.955 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:11:27 compute-0 nova_compute[183177]: 2026-01-26 20:11:27.365 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:29 compute-0 nova_compute[183177]: 2026-01-26 20:11:29.169 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:29 compute-0 podman[192499]: time="2026-01-26T20:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:11:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:11:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 26 20:11:30 compute-0 sshd-session[216529]: Connection closed by authenticating user root 142.93.140.142 port 35748 [preauth]
Jan 26 20:11:31 compute-0 openstack_network_exporter[195363]: ERROR   20:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:11:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:11:31 compute-0 openstack_network_exporter[195363]: ERROR   20:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:11:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:11:32 compute-0 nova_compute[183177]: 2026-01-26 20:11:32.420 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:34 compute-0 nova_compute[183177]: 2026-01-26 20:11:34.171 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:34 compute-0 sshd-session[216531]: Connection closed by authenticating user root 188.166.116.149 port 39098 [preauth]
Jan 26 20:11:37 compute-0 nova_compute[183177]: 2026-01-26 20:11:37.421 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:39 compute-0 nova_compute[183177]: 2026-01-26 20:11:39.225 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:42 compute-0 nova_compute[183177]: 2026-01-26 20:11:42.424 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:44 compute-0 nova_compute[183177]: 2026-01-26 20:11:44.276 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:47 compute-0 nova_compute[183177]: 2026-01-26 20:11:47.425 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:49 compute-0 nova_compute[183177]: 2026-01-26 20:11:49.306 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:49 compute-0 nova_compute[183177]: 2026-01-26 20:11:49.329 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:49 compute-0 nova_compute[183177]: 2026-01-26 20:11:49.329 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:49 compute-0 podman[216535]: 2026-01-26 20:11:49.342839196 +0000 UTC m=+0.082469674 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 26 20:11:49 compute-0 podman[216534]: 2026-01-26 20:11:49.370028429 +0000 UTC m=+0.123543150 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 20:11:49 compute-0 podman[216533]: 2026-01-26 20:11:49.398834885 +0000 UTC m=+0.144991128 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 26 20:11:49 compute-0 nova_compute[183177]: 2026-01-26 20:11:49.835 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:11:50 compute-0 nova_compute[183177]: 2026-01-26 20:11:50.403 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:50 compute-0 nova_compute[183177]: 2026-01-26 20:11:50.404 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:50 compute-0 nova_compute[183177]: 2026-01-26 20:11:50.415 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:11:50 compute-0 nova_compute[183177]: 2026-01-26 20:11:50.416 183181 INFO nova.compute.claims [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:11:51 compute-0 nova_compute[183177]: 2026-01-26 20:11:51.490 183181 DEBUG nova.compute.provider_tree [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:11:51 compute-0 nova_compute[183177]: 2026-01-26 20:11:51.999 183181 DEBUG nova.scheduler.client.report [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:11:52 compute-0 nova_compute[183177]: 2026-01-26 20:11:52.427 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:52 compute-0 nova_compute[183177]: 2026-01-26 20:11:52.509 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:52 compute-0 nova_compute[183177]: 2026-01-26 20:11:52.510 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:11:53 compute-0 nova_compute[183177]: 2026-01-26 20:11:53.023 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:11:53 compute-0 nova_compute[183177]: 2026-01-26 20:11:53.024 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:11:53 compute-0 nova_compute[183177]: 2026-01-26 20:11:53.025 183181 WARNING neutronclient.v2_0.client [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:11:53 compute-0 nova_compute[183177]: 2026-01-26 20:11:53.026 183181 WARNING neutronclient.v2_0.client [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:11:53 compute-0 nova_compute[183177]: 2026-01-26 20:11:53.536 183181 INFO nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:11:54 compute-0 nova_compute[183177]: 2026-01-26 20:11:54.055 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:11:54 compute-0 nova_compute[183177]: 2026-01-26 20:11:54.309 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:54 compute-0 nova_compute[183177]: 2026-01-26 20:11:54.610 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Successfully created port: a3ec3678-3134-4696-9139-0330b96780e6 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.075 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.077 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.077 183181 INFO nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Creating image(s)
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.078 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.079 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.080 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.081 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.087 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.089 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.158 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.160 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.161 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.162 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.168 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.168 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.249 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.251 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.306 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.306 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.307 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:55 compute-0 podman[216601]: 2026-01-26 20:11:55.344471438 +0000 UTC m=+0.086977235 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.393 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.394 183181 DEBUG nova.virt.disk.api [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Checking if we can resize image /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.395 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.479 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.481 183181 DEBUG nova.virt.disk.api [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Cannot resize image /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.481 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.482 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Ensure instance console log exists: /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.482 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.483 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.483 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:11:55 compute-0 nova_compute[183177]: 2026-01-26 20:11:55.848 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Successfully updated port: a3ec3678-3134-4696-9139-0330b96780e6 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.251 183181 DEBUG nova.compute.manager [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-changed-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.252 183181 DEBUG nova.compute.manager [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Refreshing instance network info cache due to event network-changed-a3ec3678-3134-4696-9139-0330b96780e6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.252 183181 DEBUG oslo_concurrency.lockutils [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.252 183181 DEBUG oslo_concurrency.lockutils [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.252 183181 DEBUG nova.network.neutron [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Refreshing network info cache for port a3ec3678-3134-4696-9139-0330b96780e6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.355 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.757 183181 WARNING neutronclient.v2_0.client [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:11:56 compute-0 nova_compute[183177]: 2026-01-26 20:11:56.857 183181 DEBUG nova.network.neutron [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:11:57 compute-0 nova_compute[183177]: 2026-01-26 20:11:57.065 183181 DEBUG nova.network.neutron [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:11:57 compute-0 nova_compute[183177]: 2026-01-26 20:11:57.430 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:57 compute-0 nova_compute[183177]: 2026-01-26 20:11:57.571 183181 DEBUG oslo_concurrency.lockutils [req-abf2e253-d6cb-4ffc-ad23-f1064491ff3d req-7d499cf1-7095-459a-83e5-e7a1f52c59a8 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:11:57 compute-0 nova_compute[183177]: 2026-01-26 20:11:57.572 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquired lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:11:57 compute-0 nova_compute[183177]: 2026-01-26 20:11:57.572 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:11:58 compute-0 nova_compute[183177]: 2026-01-26 20:11:58.555 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:11:58 compute-0 nova_compute[183177]: 2026-01-26 20:11:58.765 183181 WARNING neutronclient.v2_0.client [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:11:58 compute-0 nova_compute[183177]: 2026-01-26 20:11:58.950 183181 DEBUG nova.network.neutron [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Updating instance_info_cache with network_info: [{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.351 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.457 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Releasing lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.458 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance network_info: |[{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.463 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Start _get_guest_xml network_info=[{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.468 183181 WARNING nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.471 183181 DEBUG nova.virt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-702608838', uuid='78b8c97e-5e87-4f6c-9031-926c30492876'), owner=OwnerMeta(userid='7a3a0c805ad14e438b8e8a90e16d8d02', username='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin', projectid='393f035f3d824babb9d76f6e83e4192b', projectname='tempest-TestExecuteZoneMigrationStrategy-1136104294'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769458319.4711719) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.475 183181 DEBUG nova.virt.libvirt.host [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.476 183181 DEBUG nova.virt.libvirt.host [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.481 183181 DEBUG nova.virt.libvirt.host [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.482 183181 DEBUG nova.virt.libvirt.host [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.484 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.484 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.485 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.485 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.486 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.486 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.486 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.487 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.487 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.487 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.487 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.488 183181 DEBUG nova.virt.hardware [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.493 183181 DEBUG nova.virt.libvirt.vif [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:11:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-702608838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-702608838',id=32,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-yiggrdr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecu
teZoneMigrationStrategy-1136104294-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:11:54Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=78b8c97e-5e87-4f6c-9031-926c30492876,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.494 183181 DEBUG nova.network.os_vif_util [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converting VIF {"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.494 183181 DEBUG nova.network.os_vif_util [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:11:59 compute-0 nova_compute[183177]: 2026-01-26 20:11:59.495 183181 DEBUG nova.objects.instance [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lazy-loading 'pci_devices' on Instance uuid 78b8c97e-5e87-4f6c-9031-926c30492876 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:11:59 compute-0 podman[192499]: time="2026-01-26T20:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:11:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:11:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.009 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <uuid>78b8c97e-5e87-4f6c-9031-926c30492876</uuid>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <name>instance-00000020</name>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-702608838</nova:name>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:11:59</nova:creationTime>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:12:00 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:12:00 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         <nova:port uuid="a3ec3678-3134-4696-9139-0330b96780e6">
Jan 26 20:12:00 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <system>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="serial">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="uuid">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </system>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <os>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </os>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <features>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </features>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:e0:8a:b0"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <target dev="tapa3ec3678-31"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <video>
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </video>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:12:00 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:12:00 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:12:00 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:12:00 compute-0 nova_compute[183177]: </domain>
Jan 26 20:12:00 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.011 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Preparing to wait for external event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.011 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.011 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.012 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.012 183181 DEBUG nova.virt.libvirt.vif [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:11:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-702608838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-702608838',id=32,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-yiggrdr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest
-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:11:54Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=78b8c97e-5e87-4f6c-9031-926c30492876,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.013 183181 DEBUG nova.network.os_vif_util [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converting VIF {"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.013 183181 DEBUG nova.network.os_vif_util [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.014 183181 DEBUG os_vif [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.014 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.015 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.015 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.016 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.016 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c8806906-f3a8-566e-afc4-ee51e6184891', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.017 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.019 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.020 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.022 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.022 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3ec3678-31, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.023 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa3ec3678-31, col_values=(('qos', UUID('29105b58-9855-4238-b283-526905d0598e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.023 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa3ec3678-31, col_values=(('external_ids', {'iface-id': 'a3ec3678-3134-4696-9139-0330b96780e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:8a:b0', 'vm-uuid': '78b8c97e-5e87-4f6c-9031-926c30492876'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.024 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.026 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:12:00 compute-0 NetworkManager[55489]: <info>  [1769458320.0265] manager: (tapa3ec3678-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.032 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:00 compute-0 nova_compute[183177]: 2026-01-26 20:12:00.033 183181 INFO os_vif [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31')
Jan 26 20:12:01 compute-0 openstack_network_exporter[195363]: ERROR   20:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:12:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:12:01 compute-0 openstack_network_exporter[195363]: ERROR   20:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:12:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:12:01 compute-0 nova_compute[183177]: 2026-01-26 20:12:01.587 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:12:01 compute-0 nova_compute[183177]: 2026-01-26 20:12:01.587 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:12:01 compute-0 nova_compute[183177]: 2026-01-26 20:12:01.587 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] No VIF found with MAC fa:16:3e:e0:8a:b0, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:12:01 compute-0 nova_compute[183177]: 2026-01-26 20:12:01.588 183181 INFO nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Using config drive
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.100 183181 WARNING neutronclient.v2_0.client [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.628 183181 INFO nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Creating config drive at /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.638 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdkjnf3ip execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.780 183181 DEBUG oslo_concurrency.processutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpdkjnf3ip" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:02 compute-0 kernel: tapa3ec3678-31: entered promiscuous mode
Jan 26 20:12:02 compute-0 NetworkManager[55489]: <info>  [1769458322.8807] manager: (tapa3ec3678-31): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Jan 26 20:12:02 compute-0 ovn_controller[95396]: 2026-01-26T20:12:02Z|00241|binding|INFO|Claiming lport a3ec3678-3134-4696-9139-0330b96780e6 for this chassis.
Jan 26 20:12:02 compute-0 ovn_controller[95396]: 2026-01-26T20:12:02Z|00242|binding|INFO|a3ec3678-3134-4696-9139-0330b96780e6: Claiming fa:16:3e:e0:8a:b0 10.100.0.4
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.883 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.891 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:8a:b0 10.100.0.4'], port_security=['fa:16:3e:e0:8a:b0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '78b8c97e-5e87-4f6c-9031-926c30492876', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '00852c1f-7776-4984-b01a-e3f5f5d3f1d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03a6e965-0f13-4adb-a0dd-8b518d1d2445, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=a3ec3678-3134-4696-9139-0330b96780e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.892 104672 INFO neutron.agent.ovn.metadata.agent [-] Port a3ec3678-3134-4696-9139-0330b96780e6 in datapath bbde741b-e853-4ad9-b0df-87ed33f347f8 bound to our chassis
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.894 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:12:02 compute-0 ovn_controller[95396]: 2026-01-26T20:12:02Z|00243|binding|INFO|Setting lport a3ec3678-3134-4696-9139-0330b96780e6 ovn-installed in OVS
Jan 26 20:12:02 compute-0 ovn_controller[95396]: 2026-01-26T20:12:02Z|00244|binding|INFO|Setting lport a3ec3678-3134-4696-9139-0330b96780e6 up in Southbound
Jan 26 20:12:02 compute-0 nova_compute[183177]: 2026-01-26 20:12:02.911 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.913 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[24f46b90-ff27-4956-bb58-8f6c8e639da1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.914 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbbde741b-e1 in ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.917 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbbde741b-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.917 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[54ad152a-ec16-4254-8b4c-8f59b185147f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.918 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[966e9624-e0ca-4733-bda6-599b9e378f5c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:02 compute-0 systemd-udevd[216658]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.939 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[eb40c935-68c0-4ab1-9bd8-fa2ac94f8e90]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:02 compute-0 systemd-machined[154465]: New machine qemu-24-instance-00000020.
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.950 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9b01fb4d-d365-4329-8afe-f285102f3663]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:02 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000020.
Jan 26 20:12:02 compute-0 NetworkManager[55489]: <info>  [1769458322.9680] device (tapa3ec3678-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:12:02 compute-0 NetworkManager[55489]: <info>  [1769458322.9693] device (tapa3ec3678-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:12:02 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:02.996 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ec0076-4b77-4afa-847b-a34c26ac3778]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.003 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[9af72463-c73e-4032-8afc-483c107d8d02]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 NetworkManager[55489]: <info>  [1769458323.0045] manager: (tapbbde741b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Jan 26 20:12:03 compute-0 systemd-udevd[216663]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.046 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[74c14fea-761d-4d3c-90bd-08036dcebfa0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.052 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[94077563-5011-4756-bedb-14b5dd413e4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 NetworkManager[55489]: <info>  [1769458323.0992] device (tapbbde741b-e0): carrier: link connected
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.114 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[6450d486-cb79-46bf-a3f3-f1dfe91927c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.144 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[173895eb-d1ae-4427-a606-007245be1c04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbde741b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:8a:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578198, 'reachable_time': 24591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216690, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.172 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[16dc2c58-6648-42ee-8ed3-a60cf9999bd3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:8afc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578198, 'tstamp': 578198}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216691, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.199 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[cdcb2dd7-7b24-416e-9867-53e9b66ebd35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbde741b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:8a:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578198, 'reachable_time': 24591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216692, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.253 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[29428aeb-a052-4d72-a712-20126ecbccaf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.364 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8256f2bf-e4a4-40dc-9400-1175c73912b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.365 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbde741b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.366 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.366 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbde741b-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.367 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:03 compute-0 NetworkManager[55489]: <info>  [1769458323.3687] manager: (tapbbde741b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 26 20:12:03 compute-0 kernel: tapbbde741b-e0: entered promiscuous mode
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.371 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbbde741b-e0, col_values=(('external_ids', {'iface-id': 'f7d8ca99-79c7-4c8b-b8a1-edbf275471a0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:03 compute-0 ovn_controller[95396]: 2026-01-26T20:12:03Z|00245|binding|INFO|Releasing lport f7d8ca99-79c7-4c8b-b8a1-edbf275471a0 from this chassis (sb_readonly=0)
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.373 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.399 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.401 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bb60d5f7-a010-4180-be54-6f64c9207413]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.402 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.402 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.402 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for bbde741b-e853-4ad9-b0df-87ed33f347f8 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.403 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.404 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e268dc25-6f7f-456d-a121-44ddec429dc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.404 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.405 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[bff2ce88-63dd-4c08-9a66-b10ff67559e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.406 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID bbde741b-e853-4ad9-b0df-87ed33f347f8
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:12:03 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:03.406 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'env', 'PROCESS_TAG=haproxy-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bbde741b-e853-4ad9-b0df-87ed33f347f8.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.554 183181 DEBUG nova.compute.manager [req-60a1c9d1-16d5-4dd6-9ef4-069a3acb28f2 req-f486557e-eec7-4696-9169-30e7d4124ca4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.554 183181 DEBUG oslo_concurrency.lockutils [req-60a1c9d1-16d5-4dd6-9ef4-069a3acb28f2 req-f486557e-eec7-4696-9169-30e7d4124ca4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.554 183181 DEBUG oslo_concurrency.lockutils [req-60a1c9d1-16d5-4dd6-9ef4-069a3acb28f2 req-f486557e-eec7-4696-9169-30e7d4124ca4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.555 183181 DEBUG oslo_concurrency.lockutils [req-60a1c9d1-16d5-4dd6-9ef4-069a3acb28f2 req-f486557e-eec7-4696-9169-30e7d4124ca4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.555 183181 DEBUG nova.compute.manager [req-60a1c9d1-16d5-4dd6-9ef4-069a3acb28f2 req-f486557e-eec7-4696-9169-30e7d4124ca4 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Processing event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.555 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.560 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.563 183181 INFO nova.virt.libvirt.driver [-] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance spawned successfully.
Jan 26 20:12:03 compute-0 nova_compute[183177]: 2026-01-26 20:12:03.564 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:12:03 compute-0 podman[216731]: 2026-01-26 20:12:03.879662354 +0000 UTC m=+0.086946244 container create 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120)
Jan 26 20:12:03 compute-0 podman[216731]: 2026-01-26 20:12:03.8405361 +0000 UTC m=+0.047820130 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:12:03 compute-0 systemd[1]: Started libpod-conmon-6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1.scope.
Jan 26 20:12:03 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:12:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/678d6f90e92556d0b0f195b7921e34565090b7b1023651016caa727af575e07a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:12:03 compute-0 podman[216731]: 2026-01-26 20:12:03.979836744 +0000 UTC m=+0.187120644 container init 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:12:03 compute-0 podman[216731]: 2026-01-26 20:12:03.989815792 +0000 UTC m=+0.197099682 container start 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:12:04 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [NOTICE]   (216750) : New worker (216752) forked
Jan 26 20:12:04 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [NOTICE]   (216750) : Loading success.
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.080 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.081 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.082 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.083 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.084 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.085 183181 DEBUG nova.virt.libvirt.driver [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.391 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.599 183181 INFO nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Took 9.52 seconds to spawn the instance on the hypervisor.
Jan 26 20:12:04 compute-0 nova_compute[183177]: 2026-01-26 20:12:04.600 183181 DEBUG nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.026 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.144 183181 INFO nova.compute.manager [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Took 14.79 seconds to build instance.
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.651 183181 DEBUG oslo_concurrency.lockutils [None req-8a3c2019-7e80-4867-8529-dfb70c5a9d33 7a3a0c805ad14e438b8e8a90e16d8d02 393f035f3d824babb9d76f6e83e4192b - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.322s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.657 183181 DEBUG nova.compute.manager [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.657 183181 DEBUG oslo_concurrency.lockutils [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.658 183181 DEBUG oslo_concurrency.lockutils [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.658 183181 DEBUG oslo_concurrency.lockutils [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.659 183181 DEBUG nova.compute.manager [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:12:05 compute-0 nova_compute[183177]: 2026-01-26 20:12:05.660 183181 WARNING nova.compute.manager [req-ca4e4866-8b04-490a-9e6d-aef6ce7edc3e req-9fedc7ad-5749-447a-a3a8-8285e942abf1 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received unexpected event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with vm_state active and task_state None.
Jan 26 20:12:06 compute-0 nova_compute[183177]: 2026-01-26 20:12:06.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:06 compute-0 sshd-session[216761]: Connection closed by authenticating user root 142.93.140.142 port 32866 [preauth]
Jan 26 20:12:08 compute-0 sshd-session[216763]: Invalid user oracle from 193.32.162.151 port 43128
Jan 26 20:12:08 compute-0 sshd-session[216763]: Connection closed by invalid user oracle 193.32.162.151 port 43128 [preauth]
Jan 26 20:12:09 compute-0 nova_compute[183177]: 2026-01-26 20:12:09.396 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:10 compute-0 nova_compute[183177]: 2026-01-26 20:12:10.030 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:14 compute-0 nova_compute[183177]: 2026-01-26 20:12:14.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:14 compute-0 nova_compute[183177]: 2026-01-26 20:12:14.447 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:15 compute-0 nova_compute[183177]: 2026-01-26 20:12:15.034 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:15 compute-0 ovn_controller[95396]: 2026-01-26T20:12:15Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e0:8a:b0 10.100.0.4
Jan 26 20:12:15 compute-0 ovn_controller[95396]: 2026-01-26T20:12:15Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:8a:b0 10.100.0.4
Jan 26 20:12:16 compute-0 sshd-session[216773]: Connection closed by authenticating user root 188.166.116.149 port 53480 [preauth]
Jan 26 20:12:19 compute-0 nova_compute[183177]: 2026-01-26 20:12:19.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:19 compute-0 nova_compute[183177]: 2026-01-26 20:12:19.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:19 compute-0 nova_compute[183177]: 2026-01-26 20:12:19.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:19 compute-0 nova_compute[183177]: 2026-01-26 20:12:19.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:12:19 compute-0 nova_compute[183177]: 2026-01-26 20:12:19.450 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.037 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:20 compute-0 podman[216777]: 2026-01-26 20:12:20.357194888 +0000 UTC m=+0.084562460 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 26 20:12:20 compute-0 podman[216776]: 2026-01-26 20:12:20.358544014 +0000 UTC m=+0.091719622 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9-minimal)
Jan 26 20:12:20 compute-0 podman[216775]: 2026-01-26 20:12:20.425405396 +0000 UTC m=+0.162963932 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:20 compute-0 nova_compute[183177]: 2026-01-26 20:12:20.673 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:12:21 compute-0 nova_compute[183177]: 2026-01-26 20:12:21.736 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:21 compute-0 nova_compute[183177]: 2026-01-26 20:12:21.842 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:21 compute-0 nova_compute[183177]: 2026-01-26 20:12:21.844 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:21 compute-0 nova_compute[183177]: 2026-01-26 20:12:21.919 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.092 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.094 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.132 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.133 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.06143951416016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.133 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:22 compute-0 nova_compute[183177]: 2026-01-26 20:12:22.133 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.198 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance 78b8c97e-5e87-4f6c-9031-926c30492876 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.199 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.199 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:12:22 up  1:36,  0 user,  load average: 0.25, 0.26, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_393f035f3d824babb9d76f6e83e4192b': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.223 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.242 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.243 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.260 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.296 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.357 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:12:23 compute-0 nova_compute[183177]: 2026-01-26 20:12:23.868 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:12:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:24.112 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:24.112 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:24.112 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:24 compute-0 nova_compute[183177]: 2026-01-26 20:12:24.376 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:12:24 compute-0 nova_compute[183177]: 2026-01-26 20:12:24.376 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:24 compute-0 nova_compute[183177]: 2026-01-26 20:12:24.497 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:24.926 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:12:24 compute-0 nova_compute[183177]: 2026-01-26 20:12:24.926 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:24.927 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:12:25 compute-0 nova_compute[183177]: 2026-01-26 20:12:25.038 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:26 compute-0 podman[216846]: 2026-01-26 20:12:26.348199134 +0000 UTC m=+0.096984274 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:12:26 compute-0 nova_compute[183177]: 2026-01-26 20:12:26.377 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:27 compute-0 nova_compute[183177]: 2026-01-26 20:12:27.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:29 compute-0 nova_compute[183177]: 2026-01-26 20:12:29.544 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:29 compute-0 podman[192499]: time="2026-01-26T20:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:12:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:12:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2647 "" "Go-http-client/1.1"
Jan 26 20:12:30 compute-0 nova_compute[183177]: 2026-01-26 20:12:30.039 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:30 compute-0 nova_compute[183177]: 2026-01-26 20:12:30.148 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:12:31 compute-0 openstack_network_exporter[195363]: ERROR   20:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:12:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:12:31 compute-0 openstack_network_exporter[195363]: ERROR   20:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:12:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:12:32 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:32.929 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:34 compute-0 nova_compute[183177]: 2026-01-26 20:12:34.595 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:35 compute-0 nova_compute[183177]: 2026-01-26 20:12:35.041 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:39 compute-0 nova_compute[183177]: 2026-01-26 20:12:39.643 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:40 compute-0 nova_compute[183177]: 2026-01-26 20:12:40.044 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:41 compute-0 nova_compute[183177]: 2026-01-26 20:12:41.874 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Check if temp file /var/lib/nova/instances/tmp2vue_bwg exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Jan 26 20:12:41 compute-0 nova_compute[183177]: 2026-01-26 20:12:41.879 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2vue_bwg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='78b8c97e-5e87-4f6c-9031-926c30492876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Jan 26 20:12:43 compute-0 sshd-session[216872]: Connection closed by authenticating user root 142.93.140.142 port 45142 [preauth]
Jan 26 20:12:44 compute-0 nova_compute[183177]: 2026-01-26 20:12:44.692 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:45 compute-0 nova_compute[183177]: 2026-01-26 20:12:45.045 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.198 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.292 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.294 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.370 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.372 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Preparing to wait for external event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.372 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.373 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:46 compute-0 nova_compute[183177]: 2026-01-26 20:12:46.373 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:49 compute-0 nova_compute[183177]: 2026-01-26 20:12:49.695 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:50 compute-0 nova_compute[183177]: 2026-01-26 20:12:50.046 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:51 compute-0 podman[216882]: 2026-01-26 20:12:51.331347639 +0000 UTC m=+0.065989239 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=)
Jan 26 20:12:51 compute-0 podman[216889]: 2026-01-26 20:12:51.333020724 +0000 UTC m=+0.060999205 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 20:12:51 compute-0 podman[216881]: 2026-01-26 20:12:51.362261952 +0000 UTC m=+0.106624334 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.056 183181 DEBUG nova.compute.manager [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.057 183181 DEBUG oslo_concurrency.lockutils [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.057 183181 DEBUG oslo_concurrency.lockutils [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.057 183181 DEBUG oslo_concurrency.lockutils [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.058 183181 DEBUG nova.compute.manager [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No event matching network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 in dict_keys([('network-vif-plugged', 'a3ec3678-3134-4696-9139-0330b96780e6')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.058 183181 DEBUG nova.compute.manager [req-822c14f9-560e-4d05-954a-4b6748aec96f req-9031c103-5bb8-4751-8193-bc94fa2d1b43 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:12:52 compute-0 nova_compute[183177]: 2026-01-26 20:12:52.898 183181 INFO nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Took 6.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.115 183181 DEBUG nova.compute.manager [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.116 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.116 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.116 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.117 183181 DEBUG nova.compute.manager [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Processing event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.117 183181 DEBUG nova.compute.manager [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-changed-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.117 183181 DEBUG nova.compute.manager [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Refreshing instance network info cache due to event network-changed-a3ec3678-3134-4696-9139-0330b96780e6. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.117 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.117 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.118 183181 DEBUG nova.network.neutron [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Refreshing network info cache for port a3ec3678-3134-4696-9139-0330b96780e6 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.119 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:12:54 compute-0 ovn_controller[95396]: 2026-01-26T20:12:54Z|00246|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.626 183181 WARNING neutronclient.v2_0.client [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.634 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2vue_bwg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='78b8c97e-5e87-4f6c-9031-926c30492876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(895d5a21-7379-4c8e-b446-abe87e215269),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Jan 26 20:12:54 compute-0 nova_compute[183177]: 2026-01-26 20:12:54.734 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.048 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.161 183181 WARNING neutronclient.v2_0.client [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.164 183181 DEBUG nova.objects.instance [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lazy-loading 'migration_context' on Instance uuid 78b8c97e-5e87-4f6c-9031-926c30492876 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.165 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.167 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.167 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.404 183181 DEBUG nova.network.neutron [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Updated VIF entry in instance network info cache for port a3ec3678-3134-4696-9139-0330b96780e6. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.404 183181 DEBUG nova.network.neutron [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Updating instance_info_cache with network_info: [{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.670 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.671 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.678 183181 DEBUG nova.virt.libvirt.vif [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:11:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-702608838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-702608838',id=32,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:12:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-yiggrdr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:12:04Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=78b8c97e-5e87-4f6c-9031-926c30492876,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.679 183181 DEBUG nova.network.os_vif_util [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.680 183181 DEBUG nova.network.os_vif_util [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.681 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Updating guest XML with vif config: <interface type="ethernet">
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <mac address="fa:16:3e:e0:8a:b0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <model type="virtio"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <mtu size="1442"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <target dev="tapa3ec3678-31"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]: </interface>
Jan 26 20:12:55 compute-0 nova_compute[183177]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.682 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <name>instance-00000020</name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <uuid>78b8c97e-5e87-4f6c-9031-926c30492876</uuid>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-702608838</nova:name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:11:59</nova:creationTime>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:port uuid="a3ec3678-3134-4696-9139-0330b96780e6">
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="serial">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="uuid">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:e0:8a:b0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="tapa3ec3678-31"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </target>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </console>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </input>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]: </domain>
Jan 26 20:12:55 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.684 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <name>instance-00000020</name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <uuid>78b8c97e-5e87-4f6c-9031-926c30492876</uuid>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-702608838</nova:name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:11:59</nova:creationTime>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:port uuid="a3ec3678-3134-4696-9139-0330b96780e6">
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="serial">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="uuid">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:e0:8a:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa3ec3678-31"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </target>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </console>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </input>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]: </domain>
Jan 26 20:12:55 compute-0 nova_compute[183177]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.685 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] _update_pci_xml output xml=<domain type="kvm">
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <name>instance-00000020</name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <uuid>78b8c97e-5e87-4f6c-9031-926c30492876</uuid>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-702608838</nova:name>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:11:59</nova:creationTime>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:user uuid="7a3a0c805ad14e438b8e8a90e16d8d02">tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin</nova:user>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:project uuid="393f035f3d824babb9d76f6e83e4192b">tempest-TestExecuteZoneMigrationStrategy-1136104294</nova:project>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <nova:port uuid="a3ec3678-3134-4696-9139-0330b96780e6">
Jan 26 20:12:55 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <memory unit="KiB">131072</memory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <currentMemory unit="KiB">131072</currentMemory>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <vcpu placement="static">1</vcpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <partition>/machine</partition>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </resource>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="serial">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="uuid">78b8c97e-5e87-4f6c-9031-926c30492876</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </system>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </os>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <vmcoreinfo state="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </features>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact" check="partial">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <model fallback="allow">Nehalem</model>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_poweroff>destroy</on_poweroff>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_reboot>restart</on_reboot>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <on_crash>destroy</on_crash>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/disk.config"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <readonly/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="0" model="pcie-root"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="1" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="1" port="0x10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="2" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="2" port="0x11"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="3" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="3" port="0x12"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="4" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="4" port="0x13"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="5" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="5" port="0x14"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="6" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="6" port="0x15"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="7" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="7" port="0x16"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="8" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="8" port="0x17"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="9" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="9" port="0x18"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="10" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="10" port="0x19"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="11" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="11" port="0x1a"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="12" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="12" port="0x1b"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="13" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="13" port="0x1c"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="14" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="14" port="0x1d"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="15" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="15" port="0x1e"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="16" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="16" port="0x1f"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="17" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="17" port="0x20"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="18" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="18" port="0x21"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="19" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="19" port="0x22"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="20" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="20" port="0x23"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="21" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="21" port="0x24"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="22" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="22" port="0x25"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="23" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="23" port="0x26"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="24" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="24" port="0x27"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="25" model="pcie-root-port">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-root-port"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target chassis="25" port="0x28"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model name="pcie-pci-bridge"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="usb" index="0" model="piix3-uhci">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <controller type="sata" index="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </controller>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <interface type="ethernet"><mac address="fa:16:3e:e0:8a:b0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapa3ec3678-31"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </interface><serial type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="isa-serial" port="0">
Jan 26 20:12:55 compute-0 nova_compute[183177]:         <model name="isa-serial"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       </target>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <console type="pty">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876/console.log" append="off"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <target type="serial" port="0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </console>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="usb" bus="0" port="1"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </input>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <input type="mouse" bus="ps2"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <listen type="address" address="::"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </graphics>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <model type="virtio" heads="1" primary="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </video>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:12:55 compute-0 nova_compute[183177]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:12:55 compute-0 nova_compute[183177]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Jan 26 20:12:55 compute-0 nova_compute[183177]: </domain>
Jan 26 20:12:55 compute-0 nova_compute[183177]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.685 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Jan 26 20:12:55 compute-0 nova_compute[183177]: 2026-01-26 20:12:55.912 183181 DEBUG oslo_concurrency.lockutils [req-fdfe87c3-e761-4823-bb29-33fc1e91029f req-355df272-71e6-4d55-8b30-c1ace39576be 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-78b8c97e-5e87-4f6c-9031-926c30492876" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:12:56 compute-0 nova_compute[183177]: 2026-01-26 20:12:56.174 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:12:56 compute-0 nova_compute[183177]: 2026-01-26 20:12:56.174 183181 INFO nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 26 20:12:57 compute-0 nova_compute[183177]: 2026-01-26 20:12:57.194 183181 INFO nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 26 20:12:57 compute-0 podman[216948]: 2026-01-26 20:12:57.364393519 +0000 UTC m=+0.064720335 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:12:57 compute-0 nova_compute[183177]: 2026-01-26 20:12:57.698 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:12:57 compute-0 nova_compute[183177]: 2026-01-26 20:12:57.699 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.203 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.204 183181 DEBUG nova.virt.libvirt.migration [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Jan 26 20:12:58 compute-0 kernel: tapa3ec3678-31 (unregistering): left promiscuous mode
Jan 26 20:12:58 compute-0 NetworkManager[55489]: <info>  [1769458378.5328] device (tapa3ec3678-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:12:58 compute-0 ovn_controller[95396]: 2026-01-26T20:12:58Z|00247|binding|INFO|Releasing lport a3ec3678-3134-4696-9139-0330b96780e6 from this chassis (sb_readonly=0)
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.539 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 ovn_controller[95396]: 2026-01-26T20:12:58Z|00248|binding|INFO|Setting lport a3ec3678-3134-4696-9139-0330b96780e6 down in Southbound
Jan 26 20:12:58 compute-0 ovn_controller[95396]: 2026-01-26T20:12:58Z|00249|binding|INFO|Removing iface tapa3ec3678-31 ovn-installed in OVS
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.547 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:8a:b0 10.100.0.4'], port_security=['fa:16:3e:e0:8a:b0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'c86a094a-ada8-46ad-9c23-c857e9a7b834'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '78b8c97e-5e87-4f6c-9031-926c30492876', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '393f035f3d824babb9d76f6e83e4192b', 'neutron:revision_number': '10', 'neutron:security_group_ids': '00852c1f-7776-4984-b01a-e3f5f5d3f1d7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03a6e965-0f13-4adb-a0dd-8b518d1d2445, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=a3ec3678-3134-4696-9139-0330b96780e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.548 104672 INFO neutron.agent.ovn.metadata.agent [-] Port a3ec3678-3134-4696-9139-0330b96780e6 in datapath bbde741b-e853-4ad9-b0df-87ed33f347f8 unbound from our chassis
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.549 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bbde741b-e853-4ad9-b0df-87ed33f347f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.550 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7989c464-666e-4cff-a5df-0a1a45a8f1c4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.551 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 namespace which is not needed anymore
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.556 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 26 20:12:58 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000020.scope: Consumed 14.883s CPU time.
Jan 26 20:12:58 compute-0 systemd-machined[154465]: Machine qemu-24-instance-00000020 terminated.
Jan 26 20:12:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [NOTICE]   (216750) : haproxy version is 3.0.5-8e879a5
Jan 26 20:12:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [NOTICE]   (216750) : path to executable is /usr/sbin/haproxy
Jan 26 20:12:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [WARNING]  (216750) : Exiting Master process...
Jan 26 20:12:58 compute-0 podman[217009]: 2026-01-26 20:12:58.699494172 +0000 UTC m=+0.041255133 container kill 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 20:12:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [ALERT]    (216750) : Current worker (216752) exited with code 143 (Terminated)
Jan 26 20:12:58 compute-0 neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8[216746]: [WARNING]  (216750) : All workers exited. Exiting... (0)
Jan 26 20:12:58 compute-0 sshd-session[216983]: Connection closed by authenticating user root 188.166.116.149 port 44994 [preauth]
Jan 26 20:12:58 compute-0 systemd[1]: libpod-6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1.scope: Deactivated successfully.
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.730 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.736 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 podman[217024]: 2026-01-26 20:12:58.754505204 +0000 UTC m=+0.030628616 container died 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.765 183181 DEBUG nova.virt.libvirt.guest [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.765 183181 INFO nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration operation has completed
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.765 183181 INFO nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] _post_live_migration() is started..
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.768 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.769 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.769 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.781 183181 WARNING neutronclient.v2_0.client [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.783 183181 WARNING neutronclient.v2_0.client [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:12:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1-userdata-shm.mount: Deactivated successfully.
Jan 26 20:12:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-678d6f90e92556d0b0f195b7921e34565090b7b1023651016caa727af575e07a-merged.mount: Deactivated successfully.
Jan 26 20:12:58 compute-0 podman[217024]: 2026-01-26 20:12:58.792810086 +0000 UTC m=+0.068933468 container cleanup 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 26 20:12:58 compute-0 systemd[1]: libpod-conmon-6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1.scope: Deactivated successfully.
Jan 26 20:12:58 compute-0 podman[217026]: 2026-01-26 20:12:58.810687709 +0000 UTC m=+0.077059808 container remove 6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.816 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f17c25cd-4386-4e32-84b1-48e545453bd9]: (4, ("Mon Jan 26 08:12:58 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 (6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1)\n6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1\nMon Jan 26 08:12:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 (6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1)\n6237cd842bcf6a3dd5d21cce0e460034b824f1b0d8f583933c8a080f8b2c10b1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.817 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[284ad326-cadb-46b8-9eb8-b4d8c09e048c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.818 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbde741b-e853-4ad9-b0df-87ed33f347f8.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.818 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0ebca8-1ffe-4cc3-a131-266516b2fdd2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.819 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbde741b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 kernel: tapbbde741b-e0: left promiscuous mode
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.833 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.835 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[843e018d-b8e8-4dd7-83cf-aeed7ea02232]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.842 183181 DEBUG nova.compute.manager [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.843 183181 DEBUG oslo_concurrency.lockutils [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.843 183181 DEBUG oslo_concurrency.lockutils [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.843 183181 DEBUG oslo_concurrency.lockutils [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.843 183181 DEBUG nova.compute.manager [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:12:58 compute-0 nova_compute[183177]: 2026-01-26 20:12:58.843 183181 DEBUG nova.compute.manager [req-d19c1846-ef0d-469f-9697-2119a4e2a805 req-14344729-3b42-447e-9177-2d677da3c080 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.850 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd804be-0795-478a-9971-15bc37fecc26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.851 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[3810739e-e3df-44ae-9b7d-aee4ffffd920]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.865 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d633a21-c893-40b0-87be-27789672f824]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578187, 'reachable_time': 32948, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217073, 'error': None, 'target': 'ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.866 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bbde741b-e853-4ad9-b0df-87ed33f347f8 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:12:58 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:12:58.867 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae58c96-55c4-4548-8690-7163bde9ccc7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:12:58 compute-0 systemd[1]: run-netns-ovnmeta\x2dbbde741b\x2de853\x2d4ad9\x2db0df\x2d87ed33f347f8.mount: Deactivated successfully.
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.251 183181 DEBUG nova.network.neutron [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Activated binding for port a3ec3678-3134-4696-9139-0330b96780e6 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.251 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.252 183181 DEBUG nova.virt.libvirt.vif [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:11:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-702608838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-702608838',id=32,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:12:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='393f035f3d824babb9d76f6e83e4192b',ramdisk_id='',reservation_id='r-yiggrdr3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1136104294',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1136104294-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:12:37Z,user_data=None,user_id='7a3a0c805ad14e438b8e8a90e16d8d02',uuid=78b8c97e-5e87-4f6c-9031-926c30492876,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.252 183181 DEBUG nova.network.os_vif_util [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converting VIF {"id": "a3ec3678-3134-4696-9139-0330b96780e6", "address": "fa:16:3e:e0:8a:b0", "network": {"id": "bbde741b-e853-4ad9-b0df-87ed33f347f8", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1977594762-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b4d84bb16e02476fb48c432b9e91f9fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa3ec3678-31", "ovs_interfaceid": "a3ec3678-3134-4696-9139-0330b96780e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.253 183181 DEBUG nova.network.os_vif_util [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.253 183181 DEBUG os_vif [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.255 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.255 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3ec3678-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.256 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.258 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.259 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.259 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=29105b58-9855-4238-b283-526905d0598e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.260 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.261 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.264 183181 INFO os_vif [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:8a:b0,bridge_name='br-int',has_traffic_filtering=True,id=a3ec3678-3134-4696-9139-0330b96780e6,network=Network(bbde741b-e853-4ad9-b0df-87ed33f347f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa3ec3678-31')
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.264 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.264 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.265 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.265 183181 DEBUG nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.265 183181 INFO nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Deleting instance files /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876_del
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.266 183181 INFO nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Deletion of /var/lib/nova/instances/78b8c97e-5e87-4f6c-9031-926c30492876_del complete
Jan 26 20:12:59 compute-0 nova_compute[183177]: 2026-01-26 20:12:59.772 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:12:59 compute-0 podman[192499]: time="2026-01-26T20:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:12:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:12:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.952 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.952 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.952 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.952 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.952 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.953 183181 WARNING nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received unexpected event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with vm_state active and task_state migrating.
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.953 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.953 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.953 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.953 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.954 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.955 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.955 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-unplugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.955 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.955 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.955 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 WARNING nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received unexpected event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with vm_state active and task_state migrating.
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.956 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.957 183181 DEBUG oslo_concurrency.lockutils [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.957 183181 DEBUG nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] No waiting events found dispatching network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:13:00 compute-0 nova_compute[183177]: 2026-01-26 20:13:00.957 183181 WARNING nova.compute.manager [req-1ab39889-3b15-467c-b474-c3ab1373071c req-8b9cc11d-16f6-48d6-be10-db6ccaaa9425 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Received unexpected event network-vif-plugged-a3ec3678-3134-4696-9139-0330b96780e6 for instance with vm_state active and task_state migrating.
Jan 26 20:13:01 compute-0 openstack_network_exporter[195363]: ERROR   20:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:13:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:13:01 compute-0 openstack_network_exporter[195363]: ERROR   20:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:13:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:13:04 compute-0 nova_compute[183177]: 2026-01-26 20:13:04.260 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:04 compute-0 nova_compute[183177]: 2026-01-26 20:13:04.815 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:07 compute-0 nova_compute[183177]: 2026-01-26 20:13:07.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:07 compute-0 nova_compute[183177]: 2026-01-26 20:13:07.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:07 compute-0 nova_compute[183177]: 2026-01-26 20:13:07.155 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 20:13:07 compute-0 nova_compute[183177]: 2026-01-26 20:13:07.663 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.298 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.299 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.299 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "78b8c97e-5e87-4f6c-9031-926c30492876-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.810 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.810 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.810 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.811 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.979 183181 WARNING nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.980 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.998 183181 DEBUG oslo_concurrency.processutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.999 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5726MB free_disk=73.08970642089844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:13:08 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.999 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:09 compute-0 nova_compute[183177]: 2026-01-26 20:13:08.999 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:09 compute-0 nova_compute[183177]: 2026-01-26 20:13:09.262 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:09 compute-0 nova_compute[183177]: 2026-01-26 20:13:09.850 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.022 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration for instance 78b8c97e-5e87-4f6c-9031-926c30492876 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.529 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.570 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Migration 895d5a21-7379-4c8e-b446-abe87e215269 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.570 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.570 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:13:08 up  1:37,  0 user,  load average: 0.17, 0.23, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:13:10 compute-0 nova_compute[183177]: 2026-01-26 20:13:10.614 183181 DEBUG nova.compute.provider_tree [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:13:11 compute-0 nova_compute[183177]: 2026-01-26 20:13:11.121 183181 DEBUG nova.scheduler.client.report [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:13:11 compute-0 nova_compute[183177]: 2026-01-26 20:13:11.631 183181 DEBUG nova.compute.resource_tracker [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:13:11 compute-0 nova_compute[183177]: 2026-01-26 20:13:11.632 183181 DEBUG oslo_concurrency.lockutils [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.633s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:11 compute-0 nova_compute[183177]: 2026-01-26 20:13:11.659 183181 INFO nova.compute.manager [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 26 20:13:12 compute-0 nova_compute[183177]: 2026-01-26 20:13:12.728 183181 INFO nova.scheduler.client.report [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] Deleted allocation for migration 895d5a21-7379-4c8e-b446-abe87e215269
Jan 26 20:13:12 compute-0 nova_compute[183177]: 2026-01-26 20:13:12.729 183181 DEBUG nova.virt.libvirt.driver [None req-76c200c6-a374-4847-8445-af8ff503159c 4d24f5df1de9461fbbfa48ee5bd1727b 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: 78b8c97e-5e87-4f6c-9031-926c30492876] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Jan 26 20:13:13 compute-0 nova_compute[183177]: 2026-01-26 20:13:13.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:13 compute-0 nova_compute[183177]: 2026-01-26 20:13:13.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 20:13:14 compute-0 nova_compute[183177]: 2026-01-26 20:13:14.263 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:14 compute-0 nova_compute[183177]: 2026-01-26 20:13:14.661 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:14 compute-0 nova_compute[183177]: 2026-01-26 20:13:14.886 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:19 compute-0 nova_compute[183177]: 2026-01-26 20:13:19.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:19 compute-0 nova_compute[183177]: 2026-01-26 20:13:19.265 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:19 compute-0 nova_compute[183177]: 2026-01-26 20:13:19.888 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:21 compute-0 sshd-session[217076]: Connection closed by authenticating user root 142.93.140.142 port 42544 [preauth]
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.676 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.677 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.678 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.678 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.903 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.905 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.938 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.939 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.08970642089844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.940 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:21 compute-0 nova_compute[183177]: 2026-01-26 20:13:21.940 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:22 compute-0 podman[217080]: 2026-01-26 20:13:22.357315144 +0000 UTC m=+0.092177115 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 26 20:13:22 compute-0 podman[217081]: 2026-01-26 20:13:22.357811998 +0000 UTC m=+0.086466502 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:13:22 compute-0 podman[217079]: 2026-01-26 20:13:22.417912267 +0000 UTC m=+0.163918898 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 26 20:13:23 compute-0 nova_compute[183177]: 2026-01-26 20:13:23.009 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:13:23 compute-0 nova_compute[183177]: 2026-01-26 20:13:23.010 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:13:21 up  1:37,  0 user,  load average: 0.15, 0.22, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:13:23 compute-0 nova_compute[183177]: 2026-01-26 20:13:23.039 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:13:23 compute-0 nova_compute[183177]: 2026-01-26 20:13:23.547 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:13:24 compute-0 nova_compute[183177]: 2026-01-26 20:13:24.064 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:13:24 compute-0 nova_compute[183177]: 2026-01-26 20:13:24.064 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:24.113 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:13:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:24.113 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:13:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:24.113 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:13:24 compute-0 nova_compute[183177]: 2026-01-26 20:13:24.267 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:24 compute-0 nova_compute[183177]: 2026-01-26 20:13:24.891 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:25 compute-0 nova_compute[183177]: 2026-01-26 20:13:25.065 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:28 compute-0 nova_compute[183177]: 2026-01-26 20:13:28.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:28 compute-0 podman[217142]: 2026-01-26 20:13:28.305970798 +0000 UTC m=+0.059883115 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:13:29 compute-0 nova_compute[183177]: 2026-01-26 20:13:29.270 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:29 compute-0 podman[192499]: time="2026-01-26T20:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:13:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:13:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 26 20:13:29 compute-0 nova_compute[183177]: 2026-01-26 20:13:29.892 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:30.903 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:13:30 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:30.904 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:13:30 compute-0 nova_compute[183177]: 2026-01-26 20:13:30.906 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:31 compute-0 openstack_network_exporter[195363]: ERROR   20:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:13:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:13:31 compute-0 openstack_network_exporter[195363]: ERROR   20:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:13:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:13:34 compute-0 nova_compute[183177]: 2026-01-26 20:13:34.271 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:34 compute-0 nova_compute[183177]: 2026-01-26 20:13:34.927 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:35 compute-0 nova_compute[183177]: 2026-01-26 20:13:35.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:13:39 compute-0 nova_compute[183177]: 2026-01-26 20:13:39.065 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:39 compute-0 nova_compute[183177]: 2026-01-26 20:13:39.274 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:39 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:39.905 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:13:39 compute-0 nova_compute[183177]: 2026-01-26 20:13:39.929 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:40 compute-0 sshd-session[217167]: Connection closed by authenticating user root 188.166.116.149 port 36252 [preauth]
Jan 26 20:13:44 compute-0 nova_compute[183177]: 2026-01-26 20:13:44.275 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:44 compute-0 nova_compute[183177]: 2026-01-26 20:13:44.932 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:49 compute-0 nova_compute[183177]: 2026-01-26 20:13:49.277 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:49 compute-0 nova_compute[183177]: 2026-01-26 20:13:49.969 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:51 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:51.840 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e5:bd 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dec47b42a941889a2d13d95aaab372', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af7846ee-bb5f-4892-9e05-ffc63360c016, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=26607258-3960-4738-bede-d8d21b428bff) old=Port_Binding(mac=['fa:16:3e:17:e5:bd'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dec47b42a941889a2d13d95aaab372', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:13:51 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:51.842 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 26607258-3960-4738-bede-d8d21b428bff in datapath b60a7fac-0e46-41c4-b058-76398a3bda4c updated
Jan 26 20:13:51 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:51.842 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b60a7fac-0e46-41c4-b058-76398a3bda4c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:13:51 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:13:51.844 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e4027597-6123-45ae-ba94-4c6a942cc06b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:13:53 compute-0 podman[217172]: 2026-01-26 20:13:53.360182094 +0000 UTC m=+0.086634726 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 20:13:53 compute-0 podman[217171]: 2026-01-26 20:13:53.361263214 +0000 UTC m=+0.093925173 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 26 20:13:53 compute-0 podman[217170]: 2026-01-26 20:13:53.407496 +0000 UTC m=+0.146394317 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 20:13:54 compute-0 nova_compute[183177]: 2026-01-26 20:13:54.280 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:54 compute-0 nova_compute[183177]: 2026-01-26 20:13:54.969 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:57 compute-0 sshd-session[217231]: Connection closed by authenticating user root 142.93.140.142 port 45464 [preauth]
Jan 26 20:13:59 compute-0 nova_compute[183177]: 2026-01-26 20:13:59.281 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:13:59 compute-0 podman[217233]: 2026-01-26 20:13:59.336529305 +0000 UTC m=+0.077461308 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:13:59 compute-0 podman[192499]: time="2026-01-26T20:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:13:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:13:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2185 "" "Go-http-client/1.1"
Jan 26 20:14:00 compute-0 nova_compute[183177]: 2026-01-26 20:14:00.010 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:00.744 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:6b:d5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-516e66c0-993e-4291-8bac-9b619ea85bb2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-516e66c0-993e-4291-8bac-9b619ea85bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae655879f546f9b658b4587909e2ba', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85afe72c-e974-450f-a1d5-b8ccc7b5b1cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f042d3ca-2ade-498c-8aa7-9373c77eba8b) old=Port_Binding(mac=['fa:16:3e:ac:6b:d5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-516e66c0-993e-4291-8bac-9b619ea85bb2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-516e66c0-993e-4291-8bac-9b619ea85bb2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae655879f546f9b658b4587909e2ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:14:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:00.745 104672 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f042d3ca-2ade-498c-8aa7-9373c77eba8b in datapath 516e66c0-993e-4291-8bac-9b619ea85bb2 updated
Jan 26 20:14:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:00.746 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 516e66c0-993e-4291-8bac-9b619ea85bb2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:14:00 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:00.747 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[611478fa-cbe3-44f7-ba65-07cc73a8da76]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:01 compute-0 openstack_network_exporter[195363]: ERROR   20:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:14:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:14:01 compute-0 openstack_network_exporter[195363]: ERROR   20:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:14:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:14:04 compute-0 nova_compute[183177]: 2026-01-26 20:14:04.283 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:05 compute-0 nova_compute[183177]: 2026-01-26 20:14:05.013 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:07 compute-0 nova_compute[183177]: 2026-01-26 20:14:07.659 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:09 compute-0 nova_compute[183177]: 2026-01-26 20:14:09.316 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:10 compute-0 nova_compute[183177]: 2026-01-26 20:14:10.015 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:13 compute-0 ovn_controller[95396]: 2026-01-26T20:14:13Z|00250|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 26 20:14:14 compute-0 nova_compute[183177]: 2026-01-26 20:14:14.352 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:14 compute-0 nova_compute[183177]: 2026-01-26 20:14:14.615 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:14 compute-0 nova_compute[183177]: 2026-01-26 20:14:14.616 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.016 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.121 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.676 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.676 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.685 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 26 20:14:15 compute-0 nova_compute[183177]: 2026-01-26 20:14:15.685 183181 INFO nova.compute.claims [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Claim successful on node compute-0.ctlplane.example.com
Jan 26 20:14:16 compute-0 nova_compute[183177]: 2026-01-26 20:14:16.754 183181 DEBUG nova.compute.provider_tree [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:14:17 compute-0 nova_compute[183177]: 2026-01-26 20:14:17.265 183181 DEBUG nova.scheduler.client.report [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:14:17 compute-0 nova_compute[183177]: 2026-01-26 20:14:17.776 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:17 compute-0 nova_compute[183177]: 2026-01-26 20:14:17.777 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 26 20:14:18 compute-0 nova_compute[183177]: 2026-01-26 20:14:18.289 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 26 20:14:18 compute-0 nova_compute[183177]: 2026-01-26 20:14:18.290 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 26 20:14:18 compute-0 nova_compute[183177]: 2026-01-26 20:14:18.290 183181 WARNING neutronclient.v2_0.client [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:14:18 compute-0 nova_compute[183177]: 2026-01-26 20:14:18.290 183181 WARNING neutronclient.v2_0.client [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:14:18 compute-0 nova_compute[183177]: 2026-01-26 20:14:18.797 183181 INFO nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 26 20:14:19 compute-0 nova_compute[183177]: 2026-01-26 20:14:19.135 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Successfully created port: 81e834db-ccbf-445f-a30e-0d88556aa09b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 26 20:14:19 compute-0 nova_compute[183177]: 2026-01-26 20:14:19.309 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 26 20:14:19 compute-0 nova_compute[183177]: 2026-01-26 20:14:19.354 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.018 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.281 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Successfully updated port: 81e834db-ccbf-445f-a30e-0d88556aa09b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.335 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.336 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.336 183181 INFO nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Creating image(s)
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.337 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.337 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.338 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.338 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.341 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.342 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.357 183181 DEBUG nova.compute.manager [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-changed-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.358 183181 DEBUG nova.compute.manager [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Refreshing instance network info cache due to event network-changed-81e834db-ccbf-445f-a30e-0d88556aa09b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.358 183181 DEBUG oslo_concurrency.lockutils [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.358 183181 DEBUG oslo_concurrency.lockutils [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquired lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.359 183181 DEBUG nova.network.neutron [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Refreshing network info cache for port 81e834db-ccbf-445f-a30e-0d88556aa09b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.429 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.430 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.430 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.431 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.434 183181 DEBUG oslo_utils.imageutils.format_inspector [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.435 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.496 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.497 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.549 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434,backing_fmt=raw /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.550 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "46c3eb6c81594e4d1392b4ffb0ccf21e15333434" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.550 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.622 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/46c3eb6c81594e4d1392b4ffb0ccf21e15333434 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.623 183181 DEBUG nova.virt.disk.api [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Checking if we can resize image /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.624 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.682 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.683 183181 DEBUG nova.virt.disk.api [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Cannot resize image /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.684 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.684 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Ensure instance console log exists: /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.685 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.685 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.685 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.788 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 26 20:14:20 compute-0 nova_compute[183177]: 2026-01-26 20:14:20.865 183181 WARNING neutronclient.v2_0.client [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:21 compute-0 sshd-session[217267]: Invalid user oracle from 193.32.162.151 port 48724
Jan 26 20:14:21 compute-0 sshd-session[217267]: Connection closed by invalid user oracle 193.32.162.151 port 48724 [preauth]
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.671 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.672 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.712 183181 DEBUG nova.network.neutron [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.855 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.858 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.879 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.880 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.08954620361328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.881 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:21 compute-0 nova_compute[183177]: 2026-01-26 20:14:21.882 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:22 compute-0 nova_compute[183177]: 2026-01-26 20:14:22.450 183181 DEBUG nova.network.neutron [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:14:22 compute-0 nova_compute[183177]: 2026-01-26 20:14:22.957 183181 DEBUG oslo_concurrency.lockutils [req-01775b2d-ef40-4f1d-a309-6a2077241a57 req-8d55c392-9f00-4946-9126-eec757576e74 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Releasing lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:14:22 compute-0 nova_compute[183177]: 2026-01-26 20:14:22.958 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquired lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 26 20:14:22 compute-0 nova_compute[183177]: 2026-01-26 20:14:22.958 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.023 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance a43f0d21-b3f4-43af-8f64-ef721299e79a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.024 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.024 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:14:21 up  1:38,  0 user,  load average: 0.15, 0.22, 0.25\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_38ae655879f546f9b658b4587909e2ba': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.149 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:14:23 compute-0 sshd-session[217275]: Connection closed by authenticating user root 188.166.116.149 port 41520 [preauth]
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.659 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:14:23 compute-0 nova_compute[183177]: 2026-01-26 20:14:23.690 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 26 20:14:24 compute-0 nova_compute[183177]: 2026-01-26 20:14:24.035 183181 WARNING neutronclient.v2_0.client [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:14:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:24.114 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:24.115 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:24.115 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:24 compute-0 nova_compute[183177]: 2026-01-26 20:14:24.167 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:14:24 compute-0 nova_compute[183177]: 2026-01-26 20:14:24.168 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:24 compute-0 podman[217279]: 2026-01-26 20:14:24.327856419 +0000 UTC m=+0.064503371 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 26 20:14:24 compute-0 nova_compute[183177]: 2026-01-26 20:14:24.356 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:24 compute-0 podman[217280]: 2026-01-26 20:14:24.356397277 +0000 UTC m=+0.078577838 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120)
Jan 26 20:14:24 compute-0 podman[217278]: 2026-01-26 20:14:24.372003508 +0000 UTC m=+0.110206372 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:14:24 compute-0 nova_compute[183177]: 2026-01-26 20:14:24.698 183181 DEBUG nova.network.neutron [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Updating instance_info_cache with network_info: [{"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.020 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.169 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.170 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.206 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Releasing lock "refresh_cache-a43f0d21-b3f4-43af-8f64-ef721299e79a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.207 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance network_info: |[{"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.211 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Start _get_guest_xml network_info=[{"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'image_id': '34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.217 183181 WARNING nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.219 183181 DEBUG nova.virt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategyVolume-server-265117477', uuid='a43f0d21-b3f4-43af-8f64-ef721299e79a'), owner=OwnerMeta(userid='8e2da5d9b53344dbb3c17e8fe5cc8502', username='tempest-TestExecuteZoneMigrationStrategyVolume-285404190-project-admin', projectid='38ae655879f546f9b658b4587909e2ba', projectname='tempest-TestExecuteZoneMigrationStrategyVolume-285404190'), image=ImageMeta(id='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='78406e00-3362-4102-beb8-369c301866f3', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": 
"81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769458465.219014) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.224 183181 DEBUG nova.virt.libvirt.host [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.225 183181 DEBUG nova.virt.libvirt.host [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.237 183181 DEBUG nova.virt.libvirt.host [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.238 183181 DEBUG nova.virt.libvirt.host [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.240 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.241 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-26T19:32:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='78406e00-3362-4102-beb8-369c301866f3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-26T19:32:55Z,direct_url=<?>,disk_format='qcow2',id=34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='bbc26b645f2a4d108c00608f11fdebb2',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-26T19:32:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.242 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.242 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.242 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.243 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.243 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.244 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.244 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.245 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.245 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.246 183181 DEBUG nova.virt.hardware [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.254 183181 DEBUG nova.virt.libvirt.vif [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:14:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-265117477',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-265117477',id=34,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae655879f546f9b658b4587909e2ba',ramdisk_id='',reservation_id='r-yf4nwf8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-285404190',owner_user_name='
tempest-TestExecuteZoneMigrationStrategyVolume-285404190-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:14:19Z,user_data=None,user_id='8e2da5d9b53344dbb3c17e8fe5cc8502',uuid=a43f0d21-b3f4-43af-8f64-ef721299e79a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.254 183181 DEBUG nova.network.os_vif_util [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converting VIF {"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.256 183181 DEBUG nova.network.os_vif_util [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.258 183181 DEBUG nova.objects.instance [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lazy-loading 'pci_devices' on Instance uuid a43f0d21-b3f4-43af-8f64-ef721299e79a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.776 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] End _get_guest_xml xml=<domain type="kvm">
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <uuid>a43f0d21-b3f4-43af-8f64-ef721299e79a</uuid>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <name>instance-00000022</name>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <memory>131072</memory>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <vcpu>1</vcpu>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <metadata>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:name>tempest-TestExecuteZoneMigrationStrategyVolume-server-265117477</nova:name>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:creationTime>2026-01-26 20:14:25</nova:creationTime>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:flavor name="m1.nano" id="78406e00-3362-4102-beb8-369c301866f3">
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:memory>128</nova:memory>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:disk>1</nova:disk>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:swap>0</nova:swap>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:ephemeral>0</nova:ephemeral>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:vcpus>1</nova:vcpus>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:extraSpecs>
Jan 26 20:14:25 compute-0 nova_compute[183177]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         </nova:extraSpecs>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       </nova:flavor>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:image uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5">
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:minDisk>1</nova:minDisk>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:minRam>0</nova:minRam>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:properties>
Jan 26 20:14:25 compute-0 nova_compute[183177]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         </nova:properties>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       </nova:image>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:owner>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:user uuid="8e2da5d9b53344dbb3c17e8fe5cc8502">tempest-TestExecuteZoneMigrationStrategyVolume-285404190-project-admin</nova:user>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:project uuid="38ae655879f546f9b658b4587909e2ba">tempest-TestExecuteZoneMigrationStrategyVolume-285404190</nova:project>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       </nova:owner>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:root type="image" uuid="34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <nova:ports>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         <nova:port uuid="81e834db-ccbf-445f-a30e-0d88556aa09b">
Jan 26 20:14:25 compute-0 nova_compute[183177]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:         </nova:port>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       </nova:ports>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </nova:instance>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </metadata>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <sysinfo type="smbios">
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <system>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="manufacturer">RDO</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="product">OpenStack Compute</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="serial">a43f0d21-b3f4-43af-8f64-ef721299e79a</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="uuid">a43f0d21-b3f4-43af-8f64-ef721299e79a</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <entry name="family">Virtual Machine</entry>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </system>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </sysinfo>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <os>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <boot dev="hd"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <smbios mode="sysinfo"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </os>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <features>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <acpi/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <apic/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <vmcoreinfo/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </features>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <clock offset="utc">
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <timer name="pit" tickpolicy="delay"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <timer name="hpet" present="no"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </clock>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <cpu mode="custom" match="exact">
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <model>Nehalem</model>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <topology sockets="1" cores="1" threads="1"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </cpu>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   <devices>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <disk type="file" device="disk">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <target dev="vda" bus="virtio"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <disk type="file" device="cdrom">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <driver name="qemu" type="raw" cache="none"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <source file="/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.config"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <target dev="sda" bus="sata"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </disk>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <interface type="ethernet">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <mac address="fa:16:3e:3a:f8:93"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <driver name="vhost" rx_queue_size="512"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <mtu size="1442"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <target dev="tap81e834db-cc"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </interface>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <serial type="pty">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <log file="/var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/console.log" append="off"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </serial>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <video>
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <model type="virtio"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </video>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <input type="tablet" bus="usb"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <rng model="virtio">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <backend model="random">/dev/urandom</backend>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </rng>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="pci" model="pcie-root-port"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <controller type="usb" index="0"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 26 20:14:25 compute-0 nova_compute[183177]:       <stats period="10"/>
Jan 26 20:14:25 compute-0 nova_compute[183177]:     </memballoon>
Jan 26 20:14:25 compute-0 nova_compute[183177]:   </devices>
Jan 26 20:14:25 compute-0 nova_compute[183177]: </domain>
Jan 26 20:14:25 compute-0 nova_compute[183177]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.777 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Preparing to wait for external event network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.778 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.778 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.779 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.780 183181 DEBUG nova.virt.libvirt.vif [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2026-01-26T20:14:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-265117477',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-265117477',id=34,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae655879f546f9b658b4587909e2ba',ramdisk_id='',reservation_id='r-yf4nwf8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-285404190',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-285404190-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-26T20:14:19Z,user_data=None,user_id='8e2da5d9b53344dbb3c17e8fe5cc8502',uuid=a43f0d21-b3f4-43af-8f64-ef721299e79a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.780 183181 DEBUG nova.network.os_vif_util [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converting VIF {"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.781 183181 DEBUG nova.network.os_vif_util [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.782 183181 DEBUG os_vif [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.782 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.783 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.783 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.784 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.785 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '399373c3-69c1-598b-a187-725c53169f45', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.786 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.788 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.791 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.792 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81e834db-cc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.792 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap81e834db-cc, col_values=(('qos', UUID('4bc3f14f-a699-4162-888d-9a6e35d3c448')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.793 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap81e834db-cc, col_values=(('external_ids', {'iface-id': '81e834db-ccbf-445f-a30e-0d88556aa09b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:f8:93', 'vm-uuid': 'a43f0d21-b3f4-43af-8f64-ef721299e79a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.794 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 NetworkManager[55489]: <info>  [1769458465.7963] manager: (tap81e834db-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.797 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.803 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:25 compute-0 nova_compute[183177]: 2026-01-26 20:14:25.804 183181 INFO os_vif [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc')
Jan 26 20:14:27 compute-0 nova_compute[183177]: 2026-01-26 20:14:27.374 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:14:27 compute-0 nova_compute[183177]: 2026-01-26 20:14:27.375 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:14:27 compute-0 nova_compute[183177]: 2026-01-26 20:14:27.375 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No VIF found with MAC fa:16:3e:3a:f8:93, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:14:27 compute-0 nova_compute[183177]: 2026-01-26 20:14:27.376 183181 INFO nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Using config drive
Jan 26 20:14:27 compute-0 nova_compute[183177]: 2026-01-26 20:14:27.891 183181 WARNING neutronclient.v2_0.client [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:14:28 compute-0 nova_compute[183177]: 2026-01-26 20:14:28.716 183181 INFO nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Creating config drive at /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.config
Jan 26 20:14:28 compute-0 nova_compute[183177]: 2026-01-26 20:14:28.722 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpip8kwd3h execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:28 compute-0 nova_compute[183177]: 2026-01-26 20:14:28.852 183181 DEBUG oslo_concurrency.processutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpip8kwd3h" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:28 compute-0 kernel: tap81e834db-cc: entered promiscuous mode
Jan 26 20:14:28 compute-0 NetworkManager[55489]: <info>  [1769458468.9336] manager: (tap81e834db-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 26 20:14:28 compute-0 ovn_controller[95396]: 2026-01-26T20:14:28Z|00251|binding|INFO|Claiming lport 81e834db-ccbf-445f-a30e-0d88556aa09b for this chassis.
Jan 26 20:14:28 compute-0 ovn_controller[95396]: 2026-01-26T20:14:28Z|00252|binding|INFO|81e834db-ccbf-445f-a30e-0d88556aa09b: Claiming fa:16:3e:3a:f8:93 10.100.0.8
Jan 26 20:14:28 compute-0 nova_compute[183177]: 2026-01-26 20:14:28.934 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:28 compute-0 nova_compute[183177]: 2026-01-26 20:14:28.941 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.955 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:f8:93 10.100.0.8'], port_security=['fa:16:3e:3a:f8:93 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a43f0d21-b3f4-43af-8f64-ef721299e79a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae655879f546f9b658b4587909e2ba', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e3ab412-bd7e-4a46-92e3-b4c17a5c712a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af7846ee-bb5f-4892-9e05-ffc63360c016, chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=81e834db-ccbf-445f-a30e-0d88556aa09b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.956 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 81e834db-ccbf-445f-a30e-0d88556aa09b in datapath b60a7fac-0e46-41c4-b058-76398a3bda4c bound to our chassis
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.958 104672 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b60a7fac-0e46-41c4-b058-76398a3bda4c
Jan 26 20:14:28 compute-0 systemd-udevd[217363]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.980 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7a489cb3-22a1-40eb-95bc-13f50647303d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.980 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb60a7fac-01 in ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.982 203984 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb60a7fac-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.982 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e159dfa3-3bcb-4174-9357-ea2db08b8731]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.983 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[471c88d0-f209-4863-99ea-e7baf59f9bf9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:28 compute-0 systemd-machined[154465]: New machine qemu-25-instance-00000022.
Jan 26 20:14:28 compute-0 NetworkManager[55489]: <info>  [1769458468.9989] device (tap81e834db-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 26 20:14:28 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:28.997 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[65381a19-08e0-4792-b099-f7debea982dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:28 compute-0 NetworkManager[55489]: <info>  [1769458468.9995] device (tap81e834db-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 26 20:14:29 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000022.
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.010 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 ovn_controller[95396]: 2026-01-26T20:14:29Z|00253|binding|INFO|Setting lport 81e834db-ccbf-445f-a30e-0d88556aa09b ovn-installed in OVS
Jan 26 20:14:29 compute-0 ovn_controller[95396]: 2026-01-26T20:14:29Z|00254|binding|INFO|Setting lport 81e834db-ccbf-445f-a30e-0d88556aa09b up in Southbound
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.013 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.018 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[846965bf-9252-42ef-a12e-655c0ed89e41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.062 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[dba701cb-d1db-4d75-85bb-5442f650656b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.068 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[b7119edb-ca32-4d83-b2d1-31f94c7ced8a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 NetworkManager[55489]: <info>  [1769458469.0699] manager: (tapb60a7fac-00): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 26 20:14:29 compute-0 systemd-udevd[217368]: Network interface NamePolicy= disabled on kernel command line.
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.102 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[87b07f08-d862-451c-830c-10f3b825c0cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.105 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[f8289ba8-dd0f-427d-a879-260442c5b79f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 NetworkManager[55489]: <info>  [1769458469.1365] device (tapb60a7fac-00): carrier: link connected
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.145 204720 DEBUG oslo.privsep.daemon [-] privsep: reply[df2c3085-8a5d-49ae-854a-32e3a7f1a622]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.169 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[c76b535d-4291-4ccc-a1f5-d9aee7f86616]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb60a7fac-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:e5:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592802, 'reachable_time': 35585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217396, 'error': None, 'target': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.194 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ee9055-e944-4d9a-ad8f-e3eb0d38f08d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:e5bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592802, 'tstamp': 592802}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217397, 'error': None, 'target': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.221 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3b05fe-e29f-49cc-8819-a709082be310]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb60a7fac-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:e5:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592802, 'reachable_time': 35585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217398, 'error': None, 'target': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.265 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c9c33d-9bb8-4f49-b727-9f45ce7b03b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.362 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8d61b00c-6824-49a3-89f6-c5dc74be6822]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.363 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb60a7fac-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.363 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.363 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb60a7fac-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.366 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 kernel: tapb60a7fac-00: entered promiscuous mode
Jan 26 20:14:29 compute-0 NetworkManager[55489]: <info>  [1769458469.3670] manager: (tapb60a7fac-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.370 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.373 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb60a7fac-00, col_values=(('external_ids', {'iface-id': '26607258-3960-4738-bede-d8d21b428bff'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.375 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 ovn_controller[95396]: 2026-01-26T20:14:29Z|00255|binding|INFO|Releasing lport 26607258-3960-4738-bede-d8d21b428bff from this chassis (sb_readonly=0)
Jan 26 20:14:29 compute-0 nova_compute[183177]: 2026-01-26 20:14:29.398 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.401 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a0953d-f855-4d37-a515-5eef128d78d9]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.402 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.402 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.403 104672 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b60a7fac-0e46-41c4-b058-76398a3bda4c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.403 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.404 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7da5160d-5629-49b8-b25d-d601b3d4af98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.404 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.405 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[89b29f4b-df7d-4bc5-9e1e-7968d4d64e80]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.405 104672 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: global
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     log         /dev/log local0 debug
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     log-tag     haproxy-metadata-proxy-b60a7fac-0e46-41c4-b058-76398a3bda4c
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     user        root
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     group       root
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     maxconn     1024
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     pidfile     /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     daemon
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: defaults
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     log global
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     mode http
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     option httplog
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     option dontlognull
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     option http-server-close
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     option forwardfor
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     retries                 3
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     timeout http-request    30s
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     timeout connect         30s
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     timeout client          32s
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     timeout server          32s
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     timeout http-keep-alive 30s
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: listen listener
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     bind 169.254.169.254:80
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     server metadata /var/lib/neutron/metadata_proxy
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:     http-request add-header X-OVN-Network-ID b60a7fac-0e46-41c4-b058-76398a3bda4c
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 26 20:14:29 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:29.406 104672 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'env', 'PROCESS_TAG=haproxy-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b60a7fac-0e46-41c4-b058-76398a3bda4c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 26 20:14:29 compute-0 podman[192499]: time="2026-01-26T20:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:14:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:14:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2185 "" "Go-http-client/1.1"
Jan 26 20:14:29 compute-0 podman[217439]: 2026-01-26 20:14:29.812794325 +0000 UTC m=+0.029922957 image pull 805916cc46b30a16ace21909bd4f943d5937a6ab4f97ba967b6bb1d56ac5f3f4 38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 26 20:14:30 compute-0 nova_compute[183177]: 2026-01-26 20:14:30.069 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:30 compute-0 nova_compute[183177]: 2026-01-26 20:14:30.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:30 compute-0 podman[217452]: 2026-01-26 20:14:30.358511953 +0000 UTC m=+0.087612253 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:14:30 compute-0 nova_compute[183177]: 2026-01-26 20:14:30.796 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:30 compute-0 podman[217439]: 2026-01-26 20:14:30.815423388 +0000 UTC m=+1.032551950 container create 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:14:30 compute-0 systemd[1]: Started libpod-conmon-498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c.scope.
Jan 26 20:14:30 compute-0 systemd[1]: Started libcrun container.
Jan 26 20:14:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bf20d72c496a2aca67e0d43f4cc2110f49db6cbc90af15206d28c36e8f943b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 26 20:14:30 compute-0 podman[217439]: 2026-01-26 20:14:30.928436703 +0000 UTC m=+1.145565355 container init 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_managed=true)
Jan 26 20:14:30 compute-0 podman[217439]: 2026-01-26 20:14:30.939327157 +0000 UTC m=+1.156455689 container start 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 26 20:14:30 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [NOTICE]   (217482) : New worker (217484) forked
Jan 26 20:14:30 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [NOTICE]   (217482) : Loading success.
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.061 183181 DEBUG nova.compute.manager [req-87aefd1e-e6c4-4686-91cf-d49b25f083c6 req-ffd7b9d5-206f-4000-963b-09da7b286a9c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.061 183181 DEBUG oslo_concurrency.lockutils [req-87aefd1e-e6c4-4686-91cf-d49b25f083c6 req-ffd7b9d5-206f-4000-963b-09da7b286a9c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.062 183181 DEBUG oslo_concurrency.lockutils [req-87aefd1e-e6c4-4686-91cf-d49b25f083c6 req-ffd7b9d5-206f-4000-963b-09da7b286a9c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.062 183181 DEBUG oslo_concurrency.lockutils [req-87aefd1e-e6c4-4686-91cf-d49b25f083c6 req-ffd7b9d5-206f-4000-963b-09da7b286a9c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.062 183181 DEBUG nova.compute.manager [req-87aefd1e-e6c4-4686-91cf-d49b25f083c6 req-ffd7b9d5-206f-4000-963b-09da7b286a9c 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Processing event network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.063 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.069 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.073 183181 INFO nova.virt.libvirt.driver [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance spawned successfully.
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.073 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 26 20:14:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:31.100 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:14:31 compute-0 nova_compute[183177]: 2026-01-26 20:14:31.100 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:31 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:31.103 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:14:32 compute-0 openstack_network_exporter[195363]: ERROR   20:14:32 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:14:32 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:14:32 compute-0 openstack_network_exporter[195363]: ERROR   20:14:32 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:14:32 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.160 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.161 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.161 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.161 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.162 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.162 183181 DEBUG nova.virt.libvirt.driver [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.671 183181 INFO nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Took 12.34 seconds to spawn the instance on the hypervisor.
Jan 26 20:14:32 compute-0 nova_compute[183177]: 2026-01-26 20:14:32.672 183181 DEBUG nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.167 183181 DEBUG nova.compute.manager [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.168 183181 DEBUG oslo_concurrency.lockutils [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.169 183181 DEBUG oslo_concurrency.lockutils [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.170 183181 DEBUG oslo_concurrency.lockutils [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.170 183181 DEBUG nova.compute.manager [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] No waiting events found dispatching network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.171 183181 WARNING nova.compute.manager [req-cd909949-7f2a-43de-bd48-6502396315d7 req-23c8ddc3-0a1d-4b20-81dc-de1895a226cf 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received unexpected event network-vif-plugged-81e834db-ccbf-445f-a30e-0d88556aa09b for instance with vm_state active and task_state None.
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.214 183181 INFO nova.compute.manager [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Took 17.58 seconds to build instance.
Jan 26 20:14:33 compute-0 nova_compute[183177]: 2026-01-26 20:14:33.721 183181 DEBUG oslo_concurrency.lockutils [None req-bbd83701-c443-43ee-ae6c-da00713b0c0d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:34 compute-0 sshd-session[217494]: Connection closed by authenticating user root 142.93.140.142 port 58868 [preauth]
Jan 26 20:14:35 compute-0 nova_compute[183177]: 2026-01-26 20:14:35.071 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:35 compute-0 nova_compute[183177]: 2026-01-26 20:14:35.799 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:37 compute-0 nova_compute[183177]: 2026-01-26 20:14:37.447 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:37 compute-0 nova_compute[183177]: 2026-01-26 20:14:37.449 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:37 compute-0 nova_compute[183177]: 2026-01-26 20:14:37.958 183181 DEBUG nova.objects.instance [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lazy-loading 'flavor' on Instance uuid a43f0d21-b3f4-43af-8f64-ef721299e79a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:14:38 compute-0 nova_compute[183177]: 2026-01-26 20:14:38.977 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.528s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:39 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:14:39.105 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.202 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.203 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.204 183181 INFO nova.compute.manager [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Attaching volume d347cc56-7126-48bd-a195-883fe1348f79 to /dev/vdb
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.205 183181 DEBUG nova.objects.instance [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lazy-loading 'flavor' on Instance uuid a43f0d21-b3f4-43af-8f64-ef721299e79a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.801 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.878 183181 DEBUG os_brick.utils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.100', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-0.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Jan 26 20:14:40 compute-0 nova_compute[183177]: 2026-01-26 20:14:40.881 183181 INFO oslo.privsep.daemon [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp3q48jqsk/privsep.sock']
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.641 183181 INFO oslo.privsep.daemon [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Spawned new privsep daemon via rootwrap
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.495 217500 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.501 217500 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.503 217500 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/none
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.503 217500 INFO oslo.privsep.daemon [-] privsep daemon running as pid 217500
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.644 217500 DEBUG oslo.privsep.daemon [-] privsep: reply[1b35cfcf-88a2-4768-8191-90ddb0695f71]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.708 217500 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.716 217500 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.716 217500 DEBUG oslo.privsep.daemon [-] privsep: reply[d619865e-533b-43b1-a936-a919672b3e18]: (4, ('InitiatorName=iqn.1994-05.com.redhat:53508fd4bf0', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.719 217500 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[b9ce8dac-b6e2-4326-8421-d26a4c4da76b]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Jan 26 20:14:41 compute-0 nova_compute[183177]: Traceback (most recent call last):
Jan 26 20:14:41 compute-0 nova_compute[183177]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Jan 26 20:14:41 compute-0 nova_compute[183177]:     ret = func(*f_args, **f_kwargs)
Jan 26 20:14:41 compute-0 nova_compute[183177]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Jan 26 20:14:41 compute-0 nova_compute[183177]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Jan 26 20:14:41 compute-0 nova_compute[183177]:     return func(*args, **kwargs)
Jan 26 20:14:41 compute-0 nova_compute[183177]:            ^^^^^^^^^^^^^^^^^^^^^
Jan 26 20:14:41 compute-0 nova_compute[183177]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Jan 26 20:14:41 compute-0 nova_compute[183177]:     with open_scini_device() as fd:
Jan 26 20:14:41 compute-0 nova_compute[183177]:          ^^^^^^^^^^^^^^^^^^^
Jan 26 20:14:41 compute-0 nova_compute[183177]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Jan 26 20:14:41 compute-0 nova_compute[183177]:     return next(self.gen)
Jan 26 20:14:41 compute-0 nova_compute[183177]:            ^^^^^^^^^^^^^^
Jan 26 20:14:41 compute-0 nova_compute[183177]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Jan 26 20:14:41 compute-0 nova_compute[183177]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Jan 26 20:14:41 compute-0 nova_compute[183177]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Jan 26 20:14:41 compute-0 nova_compute[183177]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.720 217500 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ce8dac-b6e2-4326-8421-d26a4c4da76b]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.721 183181 ERROR os_brick.initiator.connectors.scaleio [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.721 183181 INFO os_brick.initiator.connectors.scaleio [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.722 217500 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.730 217500 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.731 217500 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc5b5d7-8664-4a3b-9f69-f441c40441fa]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.732 217500 DEBUG oslo.privsep.daemon [-] privsep: reply[8b855744-46a9-4c7c-91e6-faf2b7e3edd6]: (4, 'a7a0dc8c-0440-40bb-835e-0c8b31a79067') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.732 183181 DEBUG oslo_concurrency.processutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.750 183181 DEBUG oslo_concurrency.processutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] CMD "nvme version" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.753 183181 DEBUG os_brick.initiator.connectors.lightos [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.755 183181 INFO os_brick.initiator.connectors.lightos [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d and IP(s) are ['38.102.83.58', '192.168.122.100', '172.19.0.100', '172.18.0.100', '172.17.0.100', 'fe80::c01:4bff:fe3d:4a17', 'fe80::fc16:3eff:fe3a:f893', 'fe80::9cfa:10ff:fe2d:96e2'] 
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.755 183181 DEBUG os_brick.initiator.connectors.lightos [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.756 183181 DEBUG os_brick.initiator.connectors.lightos [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.756 183181 DEBUG os_brick.utils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] <== get_connector_properties: return (876ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.100', 'host': 'compute-0.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:53508fd4bf0', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': 'a7a0dc8c-0440-40bb-835e-0c8b31a79067', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.58', '192.168.122.100', '172.19.0.100', '172.18.0.100', '172.17.0.100', 'fe80::c01:4bff:fe3d:4a17', 'fe80::fc16:3eff:fe3a:f893', 'fe80::9cfa:10ff:fe2d:96e2']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Jan 26 20:14:41 compute-0 nova_compute[183177]: 2026-01-26 20:14:41.756 183181 DEBUG nova.virt.block_device [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Updating existing volume attachment record: c737336e-eaa1-4de1-af2e-9e803d3450a2 _volume_attach /usr/lib/python3.12/site-packages/nova/virt/block_device.py:666
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.425 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.427 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.427 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.428 183181 DEBUG nova.virt.libvirt.volume.mount [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.429 183181 DEBUG nova.virt.libvirt.volume.mount [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] _HostMountState.mount(fstype=nfs, export=172.18.0.100:/data/cinder_backend_2, vol_name=volume-d347cc56-7126-48bd-a195-883fe1348f79, /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f, options=[]) generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:288
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.429 183181 DEBUG nova.virt.libvirt.volume.mount [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Mounting /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:301
Jan 26 20:14:43 compute-0 kernel: FS-Cache: Loaded
Jan 26 20:14:43 compute-0 kernel: Key type dns_resolver registered
Jan 26 20:14:43 compute-0 kernel: NFS: Registering the id_resolver key type
Jan 26 20:14:43 compute-0 kernel: Key type id_resolver registered
Jan 26 20:14:43 compute-0 kernel: Key type id_legacy registered
Jan 26 20:14:43 compute-0 nfsrahead[217546]: setting /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f readahead to 128
Jan 26 20:14:43 compute-0 nova_compute[183177]: 2026-01-26 20:14:43.976 183181 DEBUG nova.virt.libvirt.volume.mount [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] _HostMountState.mount() for /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f generation 0 completed successfully mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:334
Jan 26 20:14:44 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 26 20:14:44 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 26 20:14:44 compute-0 nova_compute[183177]: 2026-01-26 20:14:44.052 183181 DEBUG nova.virt.libvirt.guest [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] attach device xml: <disk type="file" device="disk">
Jan 26 20:14:44 compute-0 nova_compute[183177]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 26 20:14:44 compute-0 nova_compute[183177]:   <alias name="ua-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:14:44 compute-0 nova_compute[183177]:   <source file="/var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f/volume-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:14:44 compute-0 nova_compute[183177]:   <target dev="vdb" bus="virtio"/>
Jan 26 20:14:44 compute-0 nova_compute[183177]:   <serial>d347cc56-7126-48bd-a195-883fe1348f79</serial>
Jan 26 20:14:44 compute-0 nova_compute[183177]: </disk>
Jan 26 20:14:44 compute-0 nova_compute[183177]:  attach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:336
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                        Guru Meditation                         ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                            Package                             ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: product = OpenStack Compute
Jan 26 20:14:44 compute-0 nova_compute[183177]: vendor = RDO
Jan 26 20:14:44 compute-0 nova_compute[183177]: version = 32.1.0-0.20251105112212.710ffbb.el10
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                            Threads                             ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172414932672                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:214 in _native_thread
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `libvirt.virEventRunDefaultImpl()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/site-packages/libvirt.py:441 in virEventRunDefaultImpl
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `ret = libvirtmod.virEventRunDefaultImpl()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172423325376                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172431718080                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172918232768                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172926625472                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172935018176                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172943410880                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172951803584                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172960196288                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140172968588992                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173522212544                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173530605248                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173538997952                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173547390656                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173555783360                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173564176064                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173572568768                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173789083328                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173797492416                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173805901504                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173814310592                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `msg = _reqq.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/queue.py:171 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.not_empty.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `waiter.acquire()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                  Thread #140173975719552                   ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:178 in _handler
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `cls.handle_signal(version, service_name, log_dir, None)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:217 in handle_signal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `res = cls(version, frame).run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:266 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return super().run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return "\n".join(str(sect) for sect in self.sections)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in <genexpr>
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return "\n".join(str(sect) for sect in self.sections)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:101 in __str__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.view(self.generator())`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:130 in newgen
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `res = gen()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_reports/generators/threading.py:67 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `thread_id: tm.ThreadModel(thread_id, stack)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                         Green Threads                          ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/bin/nova-compute:8 in <module>
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `sys.exit(main())`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/cmd/compute.py:62 in main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `service.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/service.py:335 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `_launcher.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:300 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `status, signo = self._wait_for_exit_or_signal()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:278 in _wait_for_exit_or_signal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `super().wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:213 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.services.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:690 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.tg.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:368 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._wait_threads()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:343 in _wait_threads
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._perform_action_on_threads(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:270 in _perform_action_on_threads
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `action_func(x)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:344 in <lambda>
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `lambda x: x.wait(),`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:63 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.thread.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:232 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._exit_event.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:577 in poll
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.conn.consume(timeout=current_timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.ensure(_consume,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `ret, channel = autoretry_method()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return fun(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `method()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.connection.drain_events(timeout=poll_timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.transport.drain_events(self.connection, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return connection.drain_events(**kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `while not self.blocking_read(timeout):`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `frame = self.transport.read_frame()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `frame_header = read(7, True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `s = recv(n - len(rbuf))  # see note above`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._call_trampolining(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `trampoline(self,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._heartbeat_exit_event.wait(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `signaled = self._cond.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `gotit = waiter.acquire(True, timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.get_hub().switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._heartbeat_exit_event.wait(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `signaled = self._cond.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `gotit = waiter.acquire(True, timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.get_hub().switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._heartbeat_exit_event.wait(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `signaled = self._cond.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `gotit = waiter.acquire(True, timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.get_hub().switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._heartbeat_exit_event.wait(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `signaled = self._cond.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `gotit = waiter.acquire(True, timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.get_hub().switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for msg in reader:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `buf = self.readsock.recv(4096)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._read_trampoline()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._trampoline(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for msg in reader:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `buf = self.readsock.recv(4096)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._read_trampoline()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._trampoline(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for msg in reader:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `buf = self.readsock.recv(4096)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._read_trampoline()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._trampoline(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for line in f:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `data = self.read(up_to)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return _original_os.read(self._fileno, size)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.trampoline(fd, read=True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for line in f:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `data = self.read(up_to)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return _original_os.read(self._fileno, size)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.trampoline(fd, read=True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `for line in f:`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `data = self.read(up_to)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return _original_os.read(self._fileno, size)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.trampoline(fd, read=True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_utils/excutils.py:257 in wrapper
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return infunc(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:294 in _runner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `incoming = self._poll_style_listener.poll(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:42 in wrapper
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `message = func(in_self, timeout=watch.leftover(True))`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:429 in poll
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.conn.consume(timeout=min(self._current_timeout, left))`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.ensure(_consume,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `ret, channel = autoretry_method()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return fun(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `method()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.connection.drain_events(timeout=poll_timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.transport.drain_events(self.connection, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return connection.drain_events(**kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `while not self.blocking_read(timeout):`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `frame = self.transport.read_frame()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `frame_header = read(7, True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `s = recv(n - len(rbuf))  # see note above`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._call_trampolining(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `trampoline(self,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bootstrap_inner()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._target(*self._args, **self._kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/connection.py:108 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.poller.block()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/site-packages/ovs/poller.py:231 in block
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `events = self.poll.poll(self.timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib64/python3.12/site-packages/ovs/poller.py:137 in poll
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `rlist, wlist, xlist = select.select(self.rlist,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/select.py:80 in select
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.work.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:225 in _dispatch_thread
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._dispatch_events()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:393 in _dispatch_events
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `_c = self._event_notify_recv.read(1)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return _original_os.read(self._fileno, size)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hubs.trampoline(fd, read=True)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.work.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:233 in _conn_event_thread
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._dispatch_conn_event()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:239 in _dispatch_conn_event
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `handler = self._conn_event_handler_queue.get()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/queue.py:321 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return waiter.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/queue.py:140 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return get_hub().switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `func(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.work.run()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py:174 in _process_incoming
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `res = self.dispatcher.dispatch(message)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:309 in dispatch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._do_dispatch(endpoint, method, ctxt, args)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:229 in _do_dispatch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = func(ctxt, **new_args)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/exception_wrapper.py:63 in wrapped
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return f(self, context, *args, **kw)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/compute/utils.py:1483 in decorated_function
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return function(self, context, *args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:203 in decorated_function
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return function(self, context, *args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8098 in attach_volume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `do_attach_volume(context, instance, driver_bdm)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:415 in inner
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return f(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8093 in do_attach_volume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._attach_volume(context, instance, driver_bdm)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8112 in _attach_volume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `bdm.attach(context, instance, self.volume_api, self.driver,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:46 in wrapped
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `ret_val = method(obj, context, *args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:769 in attach
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._do_attach(context, instance, volume, volume_api,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:754 in _do_attach
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._volume_attach(context, volume, connector, instance,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:692 in _volume_attach
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `virt_driver.attach_volume(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2317 in attach_volume
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `instance.device_metadata = self._build_device_metadata(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13101 in _build_device_metadata
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `vifs = objects.VirtualInterfaceList.get_by_instance_uuid(context,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py:175 in wrapper
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = cls.indirection_api.object_class_action_versions(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py:240 in object_class_action_versions
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return cctxt.call(context, 'object_class_action_versions',`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py:180 in call
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = self.transport._send(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/transport.py:123 in _send
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._driver.send(target, ctxt, message,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:794 in send
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._send(target, ctxt, message, wait_for_reply, timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:783 in _send
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = self._waiter.wait(msg_id, timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:654 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `message = self.waiters.get(msg_id, timeout=timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:519 in get
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `time.sleep(0.5)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/os_brick/utils.py:49 in _sleep
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `_time_sleep(secs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:45 in sleep
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = function(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._sleep(idle)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._abort.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `event.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = function(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._sleep(idle)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._abort.wait(timeout)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `event.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = function(*args, **kwargs)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:725 in run_service
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `done.wait()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `result = hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:352 in run
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self.fire_timers(self.clock())`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:471 in fire_timers
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `timer()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/timer.py:59 in __call__
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `cb(*args, **kw)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:56 in tpool_trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `_c = _rsock.recv(1)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._read_trampoline()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `self._trampoline(`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return hub.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 26 20:14:44 compute-0 nova_compute[183177]:     `return self.greenlet.switch()`
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ------                        Green Thread                        ------
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: No Traceback!
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                           Processes                            ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: Process 183181 (under 183179) [ run by: nova (42436), state: running ]
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: ====                         Configuration                          ====
Jan 26 20:14:44 compute-0 nova_compute[183177]: ========================================================================
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: api: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   compute_link_prefix = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01
Jan 26 20:14:44 compute-0 nova_compute[183177]:   dhcp_domain = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_instance_password = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   glance_link_prefix = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_list_cells_batch_fixed_size = 100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_list_cells_batch_strategy = distributed
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_list_per_project_cells = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   list_records_by_skipping_down_cells = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   local_metadata_per_cell = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_limit = 1000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metadata_cache_expiration = 15
Jan 26 20:14:44 compute-0 nova_compute[183177]:   neutron_default_project_id = default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   response_validation = warn
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_neutron_default_nets = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_dynamic_connect_timeout = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_dynamic_failure_fatal = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_dynamic_read_timeout = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_dynamic_ssl_certfile = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_dynamic_targets = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_jsonfile_path = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vendordata_providers = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     StaticJSON
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: api_database: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   asyncio_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   asyncio_slave_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend = sqlalchemy
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_debug = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_parameters = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_recycle_time = 3600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_trace = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_inc_retry_interval = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_max_retries = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_max_retry_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_retry_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_overflow = 50
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_pool_size = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_retries = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mysql_sql_mode = TRADITIONAL
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mysql_wsrep_sync_wait = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pool_timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   slave_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sqlite_synchronous = True
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: barbican: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_endpoint = http://localhost/identity/v3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   barbican_api_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   barbican_endpoint = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   barbican_endpoint_type = internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   barbican_region_name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   number_of_retries = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry_delay = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   send_service_user_token = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   verify_ssl = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   verify_ssl_path = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: barbican_service_user: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: cache: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend = oslo_cache.dict
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend_argument = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend_expiration_time = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config_prefix = cache.oslo
Jan 26 20:14:44 compute-0 nova_compute[183177]:   dead_timeout = 60.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   debug_cache_backend = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_retry_client = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_socket_keepalive = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enforce_fips_mode = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   expiration_time = 600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hashclient_retry_attempts = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hashclient_retry_delay = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_dead_retry = 300
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_pool_connection_get_timeout = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_pool_flush_on_reconnect = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_pool_maxsize = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_pool_unused_timeout = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_sasl_enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_servers = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     localhost:11211
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_socket_timeout = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   memcache_username = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   proxies = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_db = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_sentinel_service_name = mymaster
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_sentinels = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     localhost:26379
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_server = localhost:6379
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_socket_timeout = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   redis_username = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry_attempts = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry_delay = 0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   socket_keepalive_count = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   socket_keepalive_idle = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   socket_keepalive_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tls_allowed_ciphers = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tls_cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tls_certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tls_enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tls_keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: cinder: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = password
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   catalog_info = volumev3:cinderv3:internalURL
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cross_az_attach = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   debug = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_template = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   http_retries = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   os_region_name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: compute: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   consecutive_build_service_disable_threshold = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_dedicated_set = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_shared_set = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_type_exclude_list = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_wait_for_vif_plug = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_concurrent_disk_ops = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_disk_devices_to_attach = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   packing_host_numa_cells_allocation_strategy = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   provider_config_location = /etc/nova/provider_config/
Jan 26 20:14:44 compute-0 nova_compute[183177]:   resource_provider_association_refresh = 300
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sharing_providers_max_uuids_per_request = 200
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shutdown_retry_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vmdk_allowed_types = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     monolithicSparse
Jan 26 20:14:44 compute-0 nova_compute[183177]:     streamOptimized
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: conductor: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   workers = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: console: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   allowed_origins = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_ciphers = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_minimum_version = default
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: consoleauth: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enforce_session_timeout = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   token_ttl = 600
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: cyborg: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = accelerator
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     public
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: database: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   asyncio_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   asyncio_slave_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend = sqlalchemy
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_debug = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_parameters = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_recycle_time = 3600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_trace = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_inc_retry_interval = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_max_retries = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_max_retry_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   db_retry_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_overflow = 50
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_pool_size = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_retries = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mysql_sql_mode = TRADITIONAL
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mysql_wsrep_sync_wait = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pool_timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   slave_connection = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sqlite_synchronous = True
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: default: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   allow_resize_to_same_host = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   arq_binding_timeout = 300
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backdoor_port = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backdoor_socket = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   block_device_allocate_retries = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   block_device_allocate_retries_interval = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cell_worker_thread_pool_size = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cert = self.pem
Jan 26 20:14:44 compute-0 nova_compute[183177]:   compute_driver = libvirt.LibvirtDriver
Jan 26 20:14:44 compute-0 nova_compute[183177]:   compute_monitors = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config-dir = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     /etc/nova/nova.conf.d
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config-file = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     /etc/nova/nova-compute.conf
Jan 26 20:14:44 compute-0 nova_compute[183177]:     /etc/nova/nova.conf
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config_drive_format = iso9660
Jan 26 20:14:44 compute-0 nova_compute[183177]:   config_source = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   console_host = compute-0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   control_exchange = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_allocation_ratio = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   daemon = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   debug = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_access_ip_network_name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_availability_zone = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_ephemeral_format = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_green_pool_size = 1000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_log_levels = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     amqp=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     amqplib=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     boto=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     dogpile.core.dogpile=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     glanceclient=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     iso8601=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     keystoneauth=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     keystonemiddleware=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     oslo.cache=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     oslo.messaging=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     oslo.privsep.daemon=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     oslo_messaging=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     oslo_policy=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     qpid=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     requests.packages.urllib3.connectionpool=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     requests.packages.urllib3.util.retry=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     routes.middleware=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     sqlalchemy=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     stevedore=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     suds=INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:     taskflow=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     urllib3.connectionpool=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     urllib3.util.retry=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:     websocket=WARN
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_schedule_zone = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_thread_pool_size = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disk_allocation_ratio = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_new_services = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   executor_thread_pool_size = 64
Jan 26 20:14:44 compute-0 nova_compute[183177]:   fatal_deprecations = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   flat_injected = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   force_config_drive = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   force_raw_images = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   graceful_shutdown_timeout = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   heal_instance_info_cache_interval = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host = compute-0.ctlplane.example.com
Jan 26 20:14:44 compute-0 nova_compute[183177]:   initial_cpu_allocation_ratio = 4.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   initial_disk_allocation_ratio = 0.9
Jan 26 20:14:44 compute-0 nova_compute[183177]:   initial_ram_allocation_ratio = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   injected_network_template = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_build_timeout = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_delete_interval = 300
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_format = [instance: %(uuid)s] 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_name_template = instance-%08x
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_usage_audit = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_usage_audit_period = month
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instance_uuid_format = [instance: %(uuid)s] 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instances_path = /var/lib/nova/instances
Jan 26 20:14:44 compute-0 nova_compute[183177]:   internal_service_availability_zone = internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   key = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_retry_count = 30
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log-config-append = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log-date-format = %Y-%m-%d %H:%M:%S
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log-dir = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log-file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_color = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_options = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_rotate_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_rotate_interval_type = days
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_rotation_type = size
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s
Jan 26 20:14:44 compute-0 nova_compute[183177]:   long_rpc_timeout = 1800
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_concurrent_builds = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_concurrent_live_migrations = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_concurrent_snapshots = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_local_block_devices = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_logfile_count = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_logfile_size_mb = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   maximum_instance_delete_attempts = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   migrate_max_retries = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mkisofs_cmd = /usr/bin/mkisofs
Jan 26 20:14:44 compute-0 nova_compute[183177]:   my_block_storage_ip = 192.168.122.100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   my_ip = 192.168.122.100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   my_shared_fs_storage_ip = 192.168.122.100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   network_allocate_retries = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   non_inheritable_image_properties = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     bittorrent
Jan 26 20:14:44 compute-0 nova_compute[183177]:     cache_in_nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   osapi_compute_unique_server_name_scope = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   password_length = 12
Jan 26 20:14:44 compute-0 nova_compute[183177]:   periodic_enable = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   periodic_fuzzy_delay = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pointer_model = usbtablet
Jan 26 20:14:44 compute-0 nova_compute[183177]:   preallocate_images = none
Jan 26 20:14:44 compute-0 nova_compute[183177]:   publish_errors = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pybasedir = /usr/lib/python3.12/site-packages
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ram_allocation_ratio = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rate_limit_burst = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rate_limit_except_level = CRITICAL
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rate_limit_interval = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reboot_timeout = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reclaim_instance_interval = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   record = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reimage_timeout_per_gb = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   report_interval = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rescue_timeout = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reserved_host_cpus = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reserved_host_disk_mb = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reserved_host_memory_mb = 512
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reserved_huge_pages = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   resize_confirm_window = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   resize_fs_using_block_device = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   resume_guests_state_on_host_boot = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rootwrap_config = /etc/nova/rootwrap.conf
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rpc_ping_enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rpc_response_timeout = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   run_external_periodic_tasks = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   running_deleted_instance_action = reap
Jan 26 20:14:44 compute-0 nova_compute[183177]:   running_deleted_instance_poll_interval = 1800
Jan 26 20:14:44 compute-0 nova_compute[183177]:   running_deleted_instance_timeout = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   scheduler_instance_sync_interval = 120
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service_down_time = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   servicegroup_driver = db
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shell_completion = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shelved_offload_time = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shelved_poll_interval = 3600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shutdown_timeout = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   source_is_ipv6 = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_only = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   state_path = /var/lib/nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sync_power_state_interval = 600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sync_power_state_pool_size = 1000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   syslog-log-facility = LOG_USER
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tempdir = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   thread_pool_statistic_period = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout_nbd = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   transport_url = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   update_resources_interval = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use-journal = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use-json = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use-syslog = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_cow_images = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_rootwrap_daemon = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_stderr = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vcpu_pin_set = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vif_plugging_is_fatal = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vif_plugging_timeout = 300
Jan 26 20:14:44 compute-0 nova_compute[183177]:   virt_mkfs = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   volume_usage_poll_interval = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   watch-log-file = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   web = /usr/share/spice-html5
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: devices: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled_mdev_types = 
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ephemeral_storage_encryption: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cipher = aes-xts-plain64
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_format = luks
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   key_size = 512
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: filter_scheduler: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   aggregate_image_properties_isolation_namespace = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   aggregate_image_properties_isolation_separator = .
Jan 26 20:14:44 compute-0 nova_compute[183177]:   available_filters = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     nova.scheduler.filters.all_filters
Jan 26 20:14:44 compute-0 nova_compute[183177]:   build_failure_weight_multiplier = 1000000.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cross_cell_move_weight_multiplier = 1000000.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disk_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled_filters = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     ComputeCapabilitiesFilter
Jan 26 20:14:44 compute-0 nova_compute[183177]:     ComputeFilter
Jan 26 20:14:44 compute-0 nova_compute[183177]:     ImagePropertiesFilter
Jan 26 20:14:44 compute-0 nova_compute[183177]:     ServerGroupAffinityFilter
Jan 26 20:14:44 compute-0 nova_compute[183177]:     ServerGroupAntiAffinityFilter
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host_subset_size = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hypervisor_version_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_properties_default_architecture = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_props_weight_multiplier = 0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_props_weight_setting = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   io_ops_weight_multiplier = -1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   isolated_hosts = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   isolated_images = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_instances_per_host = 50
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_io_ops_per_host = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_instances_weight_multiplier = 0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pci_in_placement = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pci_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ram_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   restrict_isolated_hosts_to_isolated_images = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shuffle_best_same_weighed_hosts = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   soft_affinity_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   soft_anti_affinity_weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   track_instance_changes = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   weight_classes = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     nova.scheduler.weights.all_weighers
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: glance: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   api_servers = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   debug = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_trusted_certificate_ids = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_certificate_validation = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_rbd_download = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_retries = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_ceph_conf = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_connect_timeout = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_pool = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_user = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = regionOne
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = image
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   verify_glance_signatures = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: guestfs: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   debug = False
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: image_cache: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   manager_interval = 2400
Jan 26 20:14:44 compute-0 nova_compute[183177]:   precache_concurrency = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remove_unused_base_images = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remove_unused_original_minimum_age_seconds = 86400
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remove_unused_resized_minimum_age_seconds = 3600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   subdirectory_name = _base
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: ironic: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   api_max_retries = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   api_retry_interval = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   conductor_group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   peer_list = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serial_console_state_timeout = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = baremetal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   shard = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     public
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: key_manager: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   backend = barbican
Jan 26 20:14:44 compute-0 nova_compute[183177]:   fixed_key = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: keystone: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = identity
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     public
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: libvirt: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ceph_mount_options = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ceph_mount_point_base = /var/lib/nova/mnt
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_uri = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_mode = custom
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_model_extra_flags = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_models = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     Nehalem
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_power_governor_high = performance
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_power_governor_low = powersave
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_power_management = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cpu_power_management_strategy = cpu_state
Jan 26 20:14:44 compute-0 nova_compute[183177]:   device_detach_attempts = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   device_detach_timeout = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disk_cachemodes = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disk_prefix = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled_perf_events = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   file_backed_memory = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   gid_maps = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hw_disk_discard = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hw_machine_type = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     x86_64=q35
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_rbd_ceph_conf = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_rbd_glance_copy_poll_interval = 15
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_rbd_glance_copy_timeout = 600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_rbd_glance_store_name = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_rbd_pool = rbd
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_type = qcow2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   images_volume_group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   inject_key = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   inject_partition = -2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   inject_password = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   iscsi_iface = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   iser_use_multipath = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_bandwidth = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_completion_timeout = 800
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_downtime = 500
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_downtime_delay = 75
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_downtime_steps = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_inbound_addr = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_permit_auto_converge = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_permit_post_copy = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_scheme = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_timeout_action = force_complete
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_tunnelled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_uri = qemu+tls://%s/system
Jan 26 20:14:44 compute-0 nova_compute[183177]:   live_migration_with_native_tls = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_queues = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mem_stats_period_seconds = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   migration_inbound_addr = 192.168.122.100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   nfs_mount_options = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   nfs_mount_point_base = /var/lib/nova/mnt
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_aoe_discover_tries = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_iser_scan_tries = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_memory_encrypted_guests = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_nvme_discover_tries = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_pcie_ports = 24
Jan 26 20:14:44 compute-0 nova_compute[183177]:   num_volume_scan_tries = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pmem_namespaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   quobyte_client_cfg = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   quobyte_mount_point_base = /var/lib/nova/mnt
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_connect_timeout = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_destroy_volume_retries = 12
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_destroy_volume_retry_interval = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_secret_uuid = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rbd_user = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   realtime_scheduler_priority = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_filesystem_transport = ssh
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rescue_image_id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rescue_kernel_id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rescue_ramdisk_id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rng_dev_path = /dev/urandom
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rx_queue_size = 512
Jan 26 20:14:44 compute-0 nova_compute[183177]:   smbfs_mount_options = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   smbfs_mount_point_base = /var/lib/nova/mnt
Jan 26 20:14:44 compute-0 nova_compute[183177]:   snapshot_compression = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   snapshot_image_format = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   snapshots_directory = /var/lib/nova/instances/snapshots
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sparse_logical_volumes = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   swtpm_enabled = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   swtpm_group = tss
Jan 26 20:14:44 compute-0 nova_compute[183177]:   swtpm_user = tss
Jan 26 20:14:44 compute-0 nova_compute[183177]:   sysinfo_serial = unique
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tb_cache_size = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   tx_queue_size = 512
Jan 26 20:14:44 compute-0 nova_compute[183177]:   uid_maps = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_virtio_for_bridges = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   virt_type = kvm
Jan 26 20:14:44 compute-0 nova_compute[183177]:   volume_clear = zero
Jan 26 20:14:44 compute-0 nova_compute[183177]:   volume_clear_size = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   volume_enforce_multipath = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   volume_use_multipath = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_cache_path = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_mount_group = qemu
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_mount_opts = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_mount_perms = 0770
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_mount_point_base = /var/lib/nova/mnt
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vzstorage_mount_user = stack
Jan 26 20:14:44 compute-0 nova_compute[183177]:   wait_soft_reboot_seconds = 120
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: manila: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = shared-file-system
Jan 26 20:14:44 compute-0 nova_compute[183177]:   share_apply_policy_timeout = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:     public
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: metrics: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   required = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   weight_multiplier = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   weight_of_unavailable = -10000.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   weight_setting = 
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: mks: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   mksproxy_base_url = http://127.0.0.1:6090/
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: neutron: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = password
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_floating_pool = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   extension_sync_interval = 600
Jan 26 20:14:44 compute-0 nova_compute[183177]:   http_retries = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metadata_proxy_shared_secret = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ovs_bridge = br-int
Jan 26 20:14:44 compute-0 nova_compute[183177]:   password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   physnets = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-name = service
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = regionOne
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = network
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service_metadata_proxy = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   system-scope = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   trust-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   username = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: neutron_tunnel: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   numa_nodes = 
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: notifications: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   bdms_in_notifications = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_level = INFO
Jan 26 20:14:44 compute-0 nova_compute[183177]:   include_share_mapping = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   notification_format = both
Jan 26 20:14:44 compute-0 nova_compute[183177]:   notify_on_state_change = vm_and_task_state
Jan 26 20:14:44 compute-0 nova_compute[183177]:   versioned_notifications_topics = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     versioned_notifications
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: nova_sys_admin: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   capabilities = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     0
Jan 26 20:14:44 compute-0 nova_compute[183177]:     1
Jan 26 20:14:44 compute-0 nova_compute[183177]:     12
Jan 26 20:14:44 compute-0 nova_compute[183177]:     2
Jan 26 20:14:44 compute-0 nova_compute[183177]:     21
Jan 26 20:14:44 compute-0 nova_compute[183177]:     3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   helper_command = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_daemon_traceback = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logger_name = oslo_privsep.daemon
Jan 26 20:14:44 compute-0 nova_compute[183177]:   thread_pool_size = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: os_brick: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   lock_path = /var/lib/nova/tmp
Jan 26 20:14:44 compute-0 nova_compute[183177]:   wait_mpath_device_attempts = 4
Jan 26 20:14:44 compute-0 nova_compute[183177]:   wait_mpath_device_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: os_vif_linux_bridge: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   flat_interface = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   forward_bridge_interface = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     all
Jan 26 20:14:44 compute-0 nova_compute[183177]:   iptables_bottom_regex = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   iptables_drop_action = DROP
Jan 26 20:14:44 compute-0 nova_compute[183177]:   iptables_top_regex = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   network_device_mtu = 1500
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_ipv6 = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vlan_interface = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: os_vif_ovs: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default_qos_type = linux-noop
Jan 26 20:14:44 compute-0 nova_compute[183177]:   isolate_vif = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   network_device_mtu = 1500
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ovs_vsctl_timeout = 120
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ovsdb_connection = tcp:127.0.0.1:6640
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ovsdb_interface = native
Jan 26 20:14:44 compute-0 nova_compute[183177]:   per_port_bridge = False
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_concurrency: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_process_locking = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   lock_path = /var/lib/nova/tmp
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_limit: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = password
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_interface = internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_region_name = regionOne
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_service_name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint_service_type = compute
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max-version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min-version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   system-scope = all
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   trust-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   username = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_messaging_metrics: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metrics_buffer_size = 1000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metrics_enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metrics_process_name = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metrics_socket_file = /var/tmp/metrics_collector.sock
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metrics_thread_stop_timeout = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_messaging_notifications: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   driver = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     messagingv2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retry = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   topics = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     notifications
Jan 26 20:14:44 compute-0 nova_compute[183177]:   transport_url = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_messaging_rabbit: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   amqp_auto_delete = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   amqp_durable_queues = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   conn_pool_min_size = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   conn_pool_ttl = 1200
Jan 26 20:14:44 compute-0 nova_compute[183177]:   direct_mandatory_flag = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_cancel_on_failover = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   heartbeat_in_pthread = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   heartbeat_rate = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   heartbeat_timeout_threshold = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   hostname = compute-0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kombu_compression = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kombu_failover_strategy = round-robin
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kombu_missing_consumer_retry_timeout = 60
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kombu_reconnect_delay = 1.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kombu_reconnect_splay = 0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   processname = nova-compute
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_ha_queues = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_interval_max = 30
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_login_method = AMQPLAIN
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_qos_prefetch_count = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_quorum_delivery_limit = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_quorum_max_memory_bytes = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_quorum_max_memory_length = 0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_quorum_queue = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_retry_backoff = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_retry_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_stream_fanout = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_transient_queues_ttl = 1800
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rabbit_transient_quorum_queue = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   rpc_conn_pool_size = 30
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_ca_file = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_cert_file = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_enforce_fips_mode = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_key_file = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_version = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_queue_manager = False
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_middleware: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   http_basic_auth_user_file = /etc/htpasswd
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_policy: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enforce_new_defaults = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enforce_scope = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   policy_default_rule = default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   policy_dirs = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     policy.d
Jan 26 20:14:44 compute-0 nova_compute[183177]:   policy_file = policy.yaml
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_content_type = application/x-www-form-urlencoded
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_ssl_ca_crt_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_ssl_client_crt_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_ssl_client_key_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_ssl_verify_server_crt = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   remote_timeout = 60.0
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_reports: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   file_event_handler = /var/lib/nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   file_event_handler_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_dir = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: oslo_versionedobjects: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   fatal_exception_format_errors = False
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: pci: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   alias = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   device_spec = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   report_in_placement = False
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: placement: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = password
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connect-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   endpoint-override = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   min_version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-name = service
Jan 26 20:14:44 compute-0 nova_compute[183177]:   region-name = regionOne
Jan 26 20:14:44 compute-0 nova_compute[183177]:   retriable-status-codes = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   service-type = placement
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retries = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   status-code-retry-delay = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   system-scope = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   trust-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   username = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   valid-interfaces = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     internal
Jan 26 20:14:44 compute-0 nova_compute[183177]:   version = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: privsep_osbrick: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   capabilities = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     2
Jan 26 20:14:44 compute-0 nova_compute[183177]:     21
Jan 26 20:14:44 compute-0 nova_compute[183177]:   group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   helper_command = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_daemon_traceback = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logger_name = os_brick.privileged
Jan 26 20:14:44 compute-0 nova_compute[183177]:   thread_pool_size = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: quota: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cores = 20
Jan 26 20:14:44 compute-0 nova_compute[183177]:   count_usage_from_placement = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   driver = nova.quota.DbQuotaDriver
Jan 26 20:14:44 compute-0 nova_compute[183177]:   injected_file_content_bytes = 10240
Jan 26 20:14:44 compute-0 nova_compute[183177]:   injected_file_path_length = 255
Jan 26 20:14:44 compute-0 nova_compute[183177]:   injected_files = 5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   instances = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   key_pairs = 100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   metadata_items = 128
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ram = 51200
Jan 26 20:14:44 compute-0 nova_compute[183177]:   recheck_quota = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_group_members = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_groups = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   unified_limits_resource_list = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     servers
Jan 26 20:14:44 compute-0 nova_compute[183177]:   unified_limits_resource_strategy = require
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: scheduler: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   discover_hosts_in_cells_interval = -1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_isolated_aggregate_filtering = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_metadata_prefilter = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   limit_tenants_to_placement_aggregate = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_attempts = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   max_placement_results = 1000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   placement_aggregate_required_for_tenants = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   query_placement_for_image_type_support = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   query_placement_for_routed_network_aggregates = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   workers = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: serial_console: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   base_url = ws://127.0.0.1:6083/
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   port_range = 10000:20000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   proxyclient_address = 127.0.0.1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serialproxy_host = 0.0.0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serialproxy_port = 6083
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: service_user: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = password
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   default-domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   domain-name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   project-name = service
Jan 26 20:14:44 compute-0 nova_compute[183177]:   send_service_user_token = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   system-scope = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   trust-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-domain-name = Default
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user-id = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   username = nova
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: spice: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   agent_enabled = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html
Jan 26 20:14:44 compute-0 nova_compute[183177]:   html5proxy_host = 0.0.0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   html5proxy_port = 6082
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_compression = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   jpeg_compression = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   playback_compression = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   require_secure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_listen = 127.0.0.1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_proxyclient_address = 127.0.0.1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   spice_direct_proxy_base_url = http://127.0.0.1:13002/nova
Jan 26 20:14:44 compute-0 nova_compute[183177]:   streaming_mode = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   zlib_compression = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: upgrade_levels: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   baseapi = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   compute = auto
Jan 26 20:14:44 compute-0 nova_compute[183177]:   conductor = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   scheduler = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vault: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   approle_role_id = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   approle_secret_id = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kv_mountpoint = secret
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kv_path = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   kv_version = 2
Jan 26 20:14:44 compute-0 nova_compute[183177]:   namespace = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   root_token_id = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ssl_ca_crt_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = 60.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_ssl = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vault_url = http://127.0.0.1:8200
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vendordata_dynamic_auth: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_section = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_type = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cafile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   certfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   collect-timing = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   keyfile = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   split-loggers = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   timeout = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vif_plug_linux_bridge_privileged: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   capabilities = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     12
Jan 26 20:14:44 compute-0 nova_compute[183177]:   group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   helper_command = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_daemon_traceback = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logger_name = oslo_privsep.daemon
Jan 26 20:14:44 compute-0 nova_compute[183177]:   thread_pool_size = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vif_plug_ovs_privileged: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   capabilities = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     1
Jan 26 20:14:44 compute-0 nova_compute[183177]:     12
Jan 26 20:14:44 compute-0 nova_compute[183177]:   group = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   helper_command = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   log_daemon_traceback = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   logger_name = oslo_privsep.daemon
Jan 26 20:14:44 compute-0 nova_compute[183177]:   thread_pool_size = 8
Jan 26 20:14:44 compute-0 nova_compute[183177]:   user = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vmware: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   api_retry_count = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ca_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cache_prefix = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cluster_name = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   connection_pool_size = 10
Jan 26 20:14:44 compute-0 nova_compute[183177]:   console_delay_seconds = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   datastore_regex = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host_ip = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host_password = ***
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host_port = 443
Jan 26 20:14:44 compute-0 nova_compute[183177]:   host_username = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   insecure = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   integration_bridge = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   maximum_objects = 100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pbm_default_policy = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pbm_enabled = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   pbm_wsdl_location = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serial_log_dir = /opt/vmware/vspc
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serial_port_proxy_uri = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   serial_port_service_uri = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   task_poll_interval = 0.5
Jan 26 20:14:44 compute-0 nova_compute[183177]:   use_linked_clone = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vnc_keymap = en-us
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vnc_port = 5900
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vnc_port_total = 10000
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: vnc: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   auth_schemes = 
Jan 26 20:14:44 compute-0 nova_compute[183177]:     none
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enabled = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html
Jan 26 20:14:44 compute-0 nova_compute[183177]:   novncproxy_host = 0.0.0.0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   novncproxy_port = 6080
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_listen = ::0
Jan 26 20:14:44 compute-0 nova_compute[183177]:   server_proxyclient_address = 192.168.122.100
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vencrypt_ca_certs = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vencrypt_client_cert = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   vencrypt_client_key = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: workarounds: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_compute_service_check_for_ffu = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_deep_image_inspection = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_fallback_pcpu_query = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_group_policy_check_upcall = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_libvirt_livesnapshot = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   disable_rootwrap = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_numa_live_migration = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   enable_qemu_monitor_announce_self = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ensure_libvirt_rbd_instance_dir_cleanup = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   handle_virt_lifecycle_events = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   libvirt_disable_apic = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   never_download_image_if_on_rbd = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   qemu_monitor_announce_self_count = 3
Jan 26 20:14:44 compute-0 nova_compute[183177]:   qemu_monitor_announce_self_interval = 1
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reserve_disk_resource_for_image_cache = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   skip_cpu_compare_at_startup = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   skip_cpu_compare_on_dest = True
Jan 26 20:14:44 compute-0 nova_compute[183177]:   skip_hypervisor_version_check_on_lm = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   skip_reserve_in_use_ironic_nodes = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   unified_limits_count_pcpu_as_vcpu = False
Jan 26 20:14:44 compute-0 nova_compute[183177]:   wait_for_vif_plugged_event_during_hard_reboot = 
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: wsgi: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   api_paste_config = api-paste.ini
Jan 26 20:14:44 compute-0 nova_compute[183177]:   secure_proxy_ssl_header = None
Jan 26 20:14:44 compute-0 nova_compute[183177]: 
Jan 26 20:14:44 compute-0 nova_compute[183177]: zvm: 
Jan 26 20:14:44 compute-0 nova_compute[183177]:   ca_file = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   cloud_connector_url = None
Jan 26 20:14:44 compute-0 nova_compute[183177]:   image_tmp_path = /var/lib/nova/images
Jan 26 20:14:44 compute-0 nova_compute[183177]:   reachable_timeout = 300
Jan 26 20:14:45 compute-0 ovn_controller[95396]: 2026-01-26T20:14:45Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:f8:93 10.100.0.8
Jan 26 20:14:45 compute-0 ovn_controller[95396]: 2026-01-26T20:14:45Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:f8:93 10.100.0.8
Jan 26 20:14:45 compute-0 nova_compute[183177]: 2026-01-26 20:14:45.115 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:45 compute-0 nova_compute[183177]: 2026-01-26 20:14:45.805 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:46 compute-0 nova_compute[183177]: 2026-01-26 20:14:46.157 183181 DEBUG nova.virt.libvirt.driver [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:14:46 compute-0 nova_compute[183177]: 2026-01-26 20:14:46.157 183181 DEBUG nova.virt.libvirt.driver [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:14:46 compute-0 nova_compute[183177]: 2026-01-26 20:14:46.158 183181 DEBUG nova.virt.libvirt.driver [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 26 20:14:46 compute-0 nova_compute[183177]: 2026-01-26 20:14:46.158 183181 DEBUG nova.virt.libvirt.driver [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] No VIF found with MAC fa:16:3e:3a:f8:93, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 26 20:14:48 compute-0 nova_compute[183177]: 2026-01-26 20:14:48.573 183181 DEBUG oslo_concurrency.lockutils [None req-4b4eefa1-4544-41d6-99bd-60f4f169e712 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 8.370s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:14:50 compute-0 nova_compute[183177]: 2026-01-26 20:14:50.116 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:50 compute-0 nova_compute[183177]: 2026-01-26 20:14:50.806 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:55 compute-0 nova_compute[183177]: 2026-01-26 20:14:55.120 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:55 compute-0 podman[217570]: 2026-01-26 20:14:55.362854925 +0000 UTC m=+0.096016718 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Jan 26 20:14:55 compute-0 podman[217571]: 2026-01-26 20:14:55.385654029 +0000 UTC m=+0.113931381 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 20:14:55 compute-0 podman[217569]: 2026-01-26 20:14:55.412470953 +0000 UTC m=+0.143527750 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:14:55 compute-0 nova_compute[183177]: 2026-01-26 20:14:55.808 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:14:59 compute-0 ovn_controller[95396]: 2026-01-26T20:14:59Z|00256|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 20:14:59 compute-0 podman[192499]: time="2026-01-26T20:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:14:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:14:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2646 "" "Go-http-client/1.1"
Jan 26 20:15:00 compute-0 nova_compute[183177]: 2026-01-26 20:15:00.121 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:00 compute-0 nova_compute[183177]: 2026-01-26 20:15:00.811 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:01 compute-0 podman[217636]: 2026-01-26 20:15:01.331590812 +0000 UTC m=+0.076576285 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:15:01 compute-0 openstack_network_exporter[195363]: ERROR   20:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:15:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:15:01 compute-0 openstack_network_exporter[195363]: ERROR   20:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:15:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:15:01 compute-0 sshd-session[217634]: Connection closed by authenticating user root 188.166.116.149 port 60436 [preauth]
Jan 26 20:15:05 compute-0 nova_compute[183177]: 2026-01-26 20:15:05.124 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:05 compute-0 nova_compute[183177]: 2026-01-26 20:15:05.813 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:08 compute-0 nova_compute[183177]: 2026-01-26 20:15:08.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:08 compute-0 nova_compute[183177]: 2026-01-26 20:15:08.184 183181 DEBUG oslo_concurrency.lockutils [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:08 compute-0 nova_compute[183177]: 2026-01-26 20:15:08.184 183181 DEBUG oslo_concurrency.lockutils [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:08 compute-0 nova_compute[183177]: 2026-01-26 20:15:08.697 183181 DEBUG nova.objects.instance [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lazy-loading 'flavor' on Instance uuid a43f0d21-b3f4-43af-8f64-ef721299e79a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.214 183181 INFO nova.compute.manager [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Detaching volume d347cc56-7126-48bd-a195-883fe1348f79
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.427 183181 INFO nova.virt.block_device [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Attempting to driver detach volume d347cc56-7126-48bd-a195-883fe1348f79 from mountpoint /dev/vdb
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.436 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Found disk vdb by alias ua-d347cc56-7126-48bd-a195-883fe1348f79 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.439 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Found disk vdb by alias ua-d347cc56-7126-48bd-a195-883fe1348f79 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.440 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Attempting to detach device vdb from instance a43f0d21-b3f4-43af-8f64-ef721299e79a from the persistent domain config. _detach_from_persistent /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2576
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.440 183181 DEBUG nova.virt.libvirt.guest [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] detach device xml: <disk type="file" device="disk">
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <alias name="ua-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <source file="/var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f/volume-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <target dev="vdb" bus="virtio"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <serial>d347cc56-7126-48bd-a195-883fe1348f79</serial>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]: </disk>
Jan 26 20:15:09 compute-0 nova_compute[183177]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.449 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Found disk vdb by alias ua-d347cc56-7126-48bd-a195-883fe1348f79 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.449 183181 WARNING nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Failed to detach device vdb from instance a43f0d21-b3f4-43af-8f64-ef721299e79a from the persistent domain config. Libvirt did not report any error but the device is still in the config.
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.450 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] (1/8): Attempting to detach device vdb with device alias ua-d347cc56-7126-48bd-a195-883fe1348f79 from instance a43f0d21-b3f4-43af-8f64-ef721299e79a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2612
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.450 183181 DEBUG nova.virt.libvirt.guest [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] detach device xml: <disk type="file" device="disk">
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <alias name="ua-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <source file="/var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f/volume-d347cc56-7126-48bd-a195-883fe1348f79"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <target dev="vdb" bus="virtio"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <serial>d347cc56-7126-48bd-a195-883fe1348f79</serial>
Jan 26 20:15:09 compute-0 nova_compute[183177]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 26 20:15:09 compute-0 nova_compute[183177]: </disk>
Jan 26 20:15:09 compute-0 nova_compute[183177]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Jan 26 20:15:09 compute-0 nova_compute[183177]: 2026-01-26 20:15:09.513 183181 DEBUG nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Start waiting for the detach event from libvirt for device vdb with device alias ua-d347cc56-7126-48bd-a195-883fe1348f79 for instance a43f0d21-b3f4-43af-8f64-ef721299e79a _detach_from_live_and_wait_for_event /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2688
Jan 26 20:15:10 compute-0 nova_compute[183177]: 2026-01-26 20:15:10.177 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:10 compute-0 nova_compute[183177]: 2026-01-26 20:15:10.814 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:11 compute-0 sshd-session[217662]: Connection closed by authenticating user root 142.93.140.142 port 46926 [preauth]
Jan 26 20:15:15 compute-0 nova_compute[183177]: 2026-01-26 20:15:15.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:15 compute-0 nova_compute[183177]: 2026-01-26 20:15:15.225 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:15 compute-0 nova_compute[183177]: 2026-01-26 20:15:15.817 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:20 compute-0 nova_compute[183177]: 2026-01-26 20:15:20.226 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:20 compute-0 nova_compute[183177]: 2026-01-26 20:15:20.818 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:21 compute-0 nova_compute[183177]: 2026-01-26 20:15:21.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:21 compute-0 nova_compute[183177]: 2026-01-26 20:15:21.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:21 compute-0 nova_compute[183177]: 2026-01-26 20:15:21.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:21 compute-0 nova_compute[183177]: 2026-01-26 20:15:21.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:21 compute-0 nova_compute[183177]: 2026-01-26 20:15:21.673 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:15:22 compute-0 nova_compute[183177]: 2026-01-26 20:15:22.731 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:15:22 compute-0 nova_compute[183177]: 2026-01-26 20:15:22.818 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:15:22 compute-0 nova_compute[183177]: 2026-01-26 20:15:22.819 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:15:22 compute-0 nova_compute[183177]: 2026-01-26 20:15:22.890 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.109 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.111 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.145 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.146 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5486MB free_disk=73.06093215942383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.146 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:23 compute-0 nova_compute[183177]: 2026-01-26 20:15:23.146 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:24.116 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:24.117 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:24.117 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:24 compute-0 nova_compute[183177]: 2026-01-26 20:15:24.261 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Instance a43f0d21-b3f4-43af-8f64-ef721299e79a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 26 20:15:24 compute-0 nova_compute[183177]: 2026-01-26 20:15:24.262 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:15:24 compute-0 nova_compute[183177]: 2026-01-26 20:15:24.262 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:15:23 up  1:39,  0 user,  load average: 0.12, 0.21, 0.25\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_38ae655879f546f9b658b4587909e2ba': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:15:24 compute-0 nova_compute[183177]: 2026-01-26 20:15:24.388 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:15:24 compute-0 nova_compute[183177]: 2026-01-26 20:15:24.897 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:15:25 compute-0 nova_compute[183177]: 2026-01-26 20:15:25.229 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:25 compute-0 nova_compute[183177]: 2026-01-26 20:15:25.410 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:15:25 compute-0 nova_compute[183177]: 2026-01-26 20:15:25.410 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.264s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:25 compute-0 nova_compute[183177]: 2026-01-26 20:15:25.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:26 compute-0 podman[217675]: 2026-01-26 20:15:26.319182074 +0000 UTC m=+0.061375544 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 26 20:15:26 compute-0 podman[217674]: 2026-01-26 20:15:26.345272098 +0000 UTC m=+0.084844457 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public)
Jan 26 20:15:26 compute-0 podman[217673]: 2026-01-26 20:15:26.36391356 +0000 UTC m=+0.104194839 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260120, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2)
Jan 26 20:15:26 compute-0 nova_compute[183177]: 2026-01-26 20:15:26.406 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:26 compute-0 nova_compute[183177]: 2026-01-26 20:15:26.406 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:26 compute-0 nova_compute[183177]: 2026-01-26 20:15:26.407 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:26 compute-0 nova_compute[183177]: 2026-01-26 20:15:26.407 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:26 compute-0 nova_compute[183177]: 2026-01-26 20:15:26.407 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.515 183181 WARNING nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Waiting for libvirt event about the detach of device vdb with device alias ua-d347cc56-7126-48bd-a195-883fe1348f79 from instance a43f0d21-b3f4-43af-8f64-ef721299e79a is timed out.
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.523 183181 INFO nova.virt.libvirt.driver [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Successfully detached device vdb from instance a43f0d21-b3f4-43af-8f64-ef721299e79a from the live domain config.
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.526 183181 DEBUG nova.virt.libvirt.volume.mount [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.527 183181 DEBUG nova.virt.libvirt.volume.mount [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] _HostMountState.umount(vol_name=volume-d347cc56-7126-48bd-a195-883fe1348f79, mountpoint=/var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f) generation 0 umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:349
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.528 183181 DEBUG nova.virt.libvirt.volume.mount [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Unmounting /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f generation 0 _real_umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:382
Jan 26 20:15:29 compute-0 systemd[1]: var-lib-nova-mnt-c5dc6b0fa37e191feea89c6c3cfe339f.mount: Deactivated successfully.
Jan 26 20:15:29 compute-0 nova_compute[183177]: 2026-01-26 20:15:29.580 183181 DEBUG nova.virt.libvirt.volume.mount [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] _HostMountState.umount() for /var/lib/nova/mnt/c5dc6b0fa37e191feea89c6c3cfe339f generation 0 completed successfully umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:372
Jan 26 20:15:29 compute-0 podman[192499]: time="2026-01-26T20:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:15:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16574 "" "Go-http-client/1.1"
Jan 26 20:15:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Jan 26 20:15:30 compute-0 nova_compute[183177]: 2026-01-26 20:15:30.232 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:30 compute-0 nova_compute[183177]: 2026-01-26 20:15:30.823 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:30 compute-0 nova_compute[183177]: 2026-01-26 20:15:30.922 183181 DEBUG oslo_concurrency.lockutils [None req-8d03e4f6-61ed-4e26-bbbc-062cce28bcd4 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 22.738s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:31 compute-0 nova_compute[183177]: 2026-01-26 20:15:31.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:15:31 compute-0 openstack_network_exporter[195363]: ERROR   20:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:15:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:15:31 compute-0 openstack_network_exporter[195363]: ERROR   20:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:15:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:15:32 compute-0 podman[217745]: 2026-01-26 20:15:32.372771318 +0000 UTC m=+0.118083583 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.686 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.686 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.687 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.687 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.688 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:32 compute-0 nova_compute[183177]: 2026-01-26 20:15:32.703 183181 INFO nova.compute.manager [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Terminating instance
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.223 183181 DEBUG nova.compute.manager [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 26 20:15:33 compute-0 kernel: tap81e834db-cc (unregistering): left promiscuous mode
Jan 26 20:15:33 compute-0 NetworkManager[55489]: <info>  [1769458533.2606] device (tap81e834db-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 26 20:15:33 compute-0 ovn_controller[95396]: 2026-01-26T20:15:33Z|00257|binding|INFO|Releasing lport 81e834db-ccbf-445f-a30e-0d88556aa09b from this chassis (sb_readonly=0)
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.274 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:33 compute-0 ovn_controller[95396]: 2026-01-26T20:15:33Z|00258|binding|INFO|Setting lport 81e834db-ccbf-445f-a30e-0d88556aa09b down in Southbound
Jan 26 20:15:33 compute-0 ovn_controller[95396]: 2026-01-26T20:15:33Z|00259|binding|INFO|Removing iface tap81e834db-cc ovn-installed in OVS
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.278 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.291 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:33 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 26 20:15:33 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000022.scope: Consumed 15.847s CPU time.
Jan 26 20:15:33 compute-0 systemd-machined[154465]: Machine qemu-25-instance-00000022 terminated.
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.536 183181 INFO nova.virt.libvirt.driver [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Instance destroyed successfully.
Jan 26 20:15:33 compute-0 nova_compute[183177]: 2026-01-26 20:15:33.537 183181 DEBUG nova.objects.instance [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lazy-loading 'resources' on Instance uuid a43f0d21-b3f4-43af-8f64-ef721299e79a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.225 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:f8:93 10.100.0.8'], port_security=['fa:16:3e:3a:f8:93 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a43f0d21-b3f4-43af-8f64-ef721299e79a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae655879f546f9b658b4587909e2ba', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7e3ab412-bd7e-4a46-92e3-b4c17a5c712a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af7846ee-bb5f-4892-9e05-ffc63360c016, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>], logical_port=81e834db-ccbf-445f-a30e-0d88556aa09b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fcf148bdca0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.227 104672 INFO neutron.agent.ovn.metadata.agent [-] Port 81e834db-ccbf-445f-a30e-0d88556aa09b in datapath b60a7fac-0e46-41c4-b058-76398a3bda4c unbound from our chassis
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.228 104672 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b60a7fac-0e46-41c4-b058-76398a3bda4c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.229 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[fa53f885-8d30-42e1-ab48-a7986d0dcdca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.230 104672 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c namespace which is not needed anymore
Jan 26 20:15:34 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [NOTICE]   (217482) : haproxy version is 3.0.5-8e879a5
Jan 26 20:15:34 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [NOTICE]   (217482) : path to executable is /usr/sbin/haproxy
Jan 26 20:15:34 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [WARNING]  (217482) : Exiting Master process...
Jan 26 20:15:34 compute-0 podman[217810]: 2026-01-26 20:15:34.408349491 +0000 UTC m=+0.042934279 container kill 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Jan 26 20:15:34 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [ALERT]    (217482) : Current worker (217484) exited with code 143 (Terminated)
Jan 26 20:15:34 compute-0 neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c[217478]: [WARNING]  (217482) : All workers exited. Exiting... (0)
Jan 26 20:15:34 compute-0 systemd[1]: libpod-498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c.scope: Deactivated successfully.
Jan 26 20:15:34 compute-0 podman[217823]: 2026-01-26 20:15:34.703485925 +0000 UTC m=+0.266351370 container died 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.716 183181 DEBUG nova.virt.libvirt.vif [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2026-01-26T20:14:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-265117477',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-265117477',id=34,image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-26T20:14:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='38ae655879f546f9b658b4587909e2ba',ramdisk_id='',reservation_id='r-yf4nwf8w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='34c9b7fc-1b4f-4f54-95fe-dad22b7aaba5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-285404190',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-285404190-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-26T20:14:32Z,user_data=None,user_id='8e2da5d9b53344dbb3c17e8fe5cc8502',uuid=a43f0d21-b3f4-43af-8f64-ef721299e79a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.716 183181 DEBUG nova.network.os_vif_util [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converting VIF {"id": "81e834db-ccbf-445f-a30e-0d88556aa09b", "address": "fa:16:3e:3a:f8:93", "network": {"id": "b60a7fac-0e46-41c4-b058-76398a3bda4c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-214112467-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "62dec47b42a941889a2d13d95aaab372", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81e834db-cc", "ovs_interfaceid": "81e834db-ccbf-445f-a30e-0d88556aa09b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.717 183181 DEBUG nova.network.os_vif_util [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.717 183181 DEBUG os_vif [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.719 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.719 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81e834db-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.720 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.722 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.722 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.723 183181 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4bc3f14f-a699-4162-888d-9a6e35d3c448) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.723 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.724 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.726 183181 INFO os_vif [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:f8:93,bridge_name='br-int',has_traffic_filtering=True,id=81e834db-ccbf-445f-a30e-0d88556aa09b,network=Network(b60a7fac-0e46-41c4-b058-76398a3bda4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81e834db-cc')
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.726 183181 INFO nova.virt.libvirt.driver [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Deleting instance files /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a_del
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.727 183181 INFO nova.virt.libvirt.driver [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Deletion of /var/lib/nova/instances/a43f0d21-b3f4-43af-8f64-ef721299e79a_del complete
Jan 26 20:15:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c-userdata-shm.mount: Deactivated successfully.
Jan 26 20:15:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-1bf20d72c496a2aca67e0d43f4cc2110f49db6cbc90af15206d28c36e8f943b7-merged.mount: Deactivated successfully.
Jan 26 20:15:34 compute-0 podman[217823]: 2026-01-26 20:15:34.764290474 +0000 UTC m=+0.327155819 container cleanup 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 20:15:34 compute-0 systemd[1]: libpod-conmon-498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c.scope: Deactivated successfully.
Jan 26 20:15:34 compute-0 podman[217835]: 2026-01-26 20:15:34.790337036 +0000 UTC m=+0.113639724 container remove 498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.802 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0bde11cc-6433-44fc-8ede-62030f8ad09d]: (4, ("Mon Jan 26 08:15:34 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c (498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c)\n498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c\nMon Jan 26 08:15:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c (498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c)\n498a61fddcd40430e4d010a77758427096c67cb025ecf057942e673d683cbd9c\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.804 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[0915ca3f-c0a8-46f0-b1ea-37c92c2c4e71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.804 104672 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b60a7fac-0e46-41c4-b058-76398a3bda4c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.805 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8ff1cf-1213-447c-aba7-d9ec3febbf58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.805 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb60a7fac-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.807 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 kernel: tapb60a7fac-00: left promiscuous mode
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.830 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.833 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e5c79b-3c04-448e-ab92-6678a3d9e012]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.850 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc668d2-8fa9-48bc-a00c-7ee1075758c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.851 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[58223473-6f04-44c3-a207-dbfeae91647d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.867 203984 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff374c3-dd8c-496d-a683-9db2bb25cb48]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592794, 'reachable_time': 41383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217856, 'error': None, 'target': 'ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.870 104941 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b60a7fac-0e46-41c4-b058-76398a3bda4c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.870 104941 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0212d9-95d0-4e11-908a-c70edfc82944]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 26 20:15:34 compute-0 systemd[1]: run-netns-ovnmeta\x2db60a7fac\x2d0e46\x2d41c4\x2db058\x2d76398a3bda4c.mount: Deactivated successfully.
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.939 183181 DEBUG nova.compute.manager [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.940 183181 DEBUG oslo_concurrency.lockutils [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.940 183181 DEBUG oslo_concurrency.lockutils [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.940 183181 DEBUG oslo_concurrency.lockutils [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.940 183181 DEBUG nova.compute.manager [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] No waiting events found dispatching network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.941 183181 DEBUG nova.compute.manager [req-5d813ef7-1534-4167-8558-b4beaa9f2bff req-3b627974-152c-43b4-8f0d-4e6d2d7652e3 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:15:34 compute-0 nova_compute[183177]: 2026-01-26 20:15:34.968 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.969 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:15:34 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:34.971 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.252 183181 INFO nova.compute.manager [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Took 2.03 seconds to destroy the instance on the hypervisor.
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.253 183181 DEBUG oslo.service.backend._eventlet.loopingcall [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.254 183181 DEBUG nova.compute.manager [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.254 183181 DEBUG nova.network.neutron [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.254 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.285 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:35 compute-0 nova_compute[183177]: 2026-01-26 20:15:35.736 183181 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 26 20:15:36 compute-0 nova_compute[183177]: 2026-01-26 20:15:36.285 183181 DEBUG nova.compute.manager [req-0008bd04-6ace-4133-a482-43a9626fd1ad req-cc1bd8c5-40bf-44f2-bb30-de2009ddd4aa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-deleted-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:15:36 compute-0 nova_compute[183177]: 2026-01-26 20:15:36.286 183181 INFO nova.compute.manager [req-0008bd04-6ace-4133-a482-43a9626fd1ad req-cc1bd8c5-40bf-44f2-bb30-de2009ddd4aa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Neutron deleted interface 81e834db-ccbf-445f-a30e-0d88556aa09b; detaching it from the instance and deleting it from the info cache
Jan 26 20:15:36 compute-0 nova_compute[183177]: 2026-01-26 20:15:36.286 183181 DEBUG nova.network.neutron [req-0008bd04-6ace-4133-a482-43a9626fd1ad req-cc1bd8c5-40bf-44f2-bb30-de2009ddd4aa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:15:36 compute-0 nova_compute[183177]: 2026-01-26 20:15:36.726 183181 DEBUG nova.network.neutron [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 26 20:15:36 compute-0 nova_compute[183177]: 2026-01-26 20:15:36.793 183181 DEBUG nova.compute.manager [req-0008bd04-6ace-4133-a482-43a9626fd1ad req-cc1bd8c5-40bf-44f2-bb30-de2009ddd4aa 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Detach interface failed, port_id=81e834db-ccbf-445f-a30e-0d88556aa09b, reason: Instance a43f0d21-b3f4-43af-8f64-ef721299e79a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.001 183181 DEBUG nova.compute.manager [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.001 183181 DEBUG oslo_concurrency.lockutils [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Acquiring lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.002 183181 DEBUG oslo_concurrency.lockutils [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.002 183181 DEBUG oslo_concurrency.lockutils [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.002 183181 DEBUG nova.compute.manager [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] No waiting events found dispatching network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.003 183181 DEBUG nova.compute.manager [req-1642bfc0-6bf7-4b39-a112-f269390064f4 req-06789c94-8c9b-4a74-aac4-30fc6c13ae33 0fe48ccc26774b28ab1cb3a80052cc2a 67312299a0dc49b19b5d88d614dcf421 - - default default] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Received event network-vif-unplugged-81e834db-ccbf-445f-a30e-0d88556aa09b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.234 183181 INFO nova.compute.manager [-] [instance: a43f0d21-b3f4-43af-8f64-ef721299e79a] Took 1.98 seconds to deallocate network for instance.
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.759 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.759 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:15:37 compute-0 nova_compute[183177]: 2026-01-26 20:15:37.813 183181 DEBUG nova.compute.provider_tree [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:15:38 compute-0 nova_compute[183177]: 2026-01-26 20:15:38.319 183181 DEBUG nova.scheduler.client.report [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:15:38 compute-0 nova_compute[183177]: 2026-01-26 20:15:38.835 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:38 compute-0 nova_compute[183177]: 2026-01-26 20:15:38.860 183181 INFO nova.scheduler.client.report [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Deleted allocations for instance a43f0d21-b3f4-43af-8f64-ef721299e79a
Jan 26 20:15:39 compute-0 nova_compute[183177]: 2026-01-26 20:15:39.726 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:39 compute-0 sshd-session[217858]: Connection closed by authenticating user root 188.166.116.149 port 39728 [preauth]
Jan 26 20:15:39 compute-0 nova_compute[183177]: 2026-01-26 20:15:39.889 183181 DEBUG oslo_concurrency.lockutils [None req-1c151e8b-f709-42ea-aaf8-48114ebe838d 8e2da5d9b53344dbb3c17e8fe5cc8502 38ae655879f546f9b658b4587909e2ba - - default default] Lock "a43f0d21-b3f4-43af-8f64-ef721299e79a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.202s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:15:39 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:15:39.973 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:15:40 compute-0 nova_compute[183177]: 2026-01-26 20:15:40.287 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:44 compute-0 nova_compute[183177]: 2026-01-26 20:15:44.729 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:45 compute-0 nova_compute[183177]: 2026-01-26 20:15:45.290 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:46 compute-0 sshd-session[217861]: Connection closed by authenticating user root 142.93.140.142 port 56550 [preauth]
Jan 26 20:15:48 compute-0 nova_compute[183177]: 2026-01-26 20:15:48.216 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:49 compute-0 nova_compute[183177]: 2026-01-26 20:15:49.734 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:50 compute-0 nova_compute[183177]: 2026-01-26 20:15:50.327 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:54 compute-0 nova_compute[183177]: 2026-01-26 20:15:54.736 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:55 compute-0 nova_compute[183177]: 2026-01-26 20:15:55.327 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:57 compute-0 podman[217865]: 2026-01-26 20:15:57.332087999 +0000 UTC m=+0.069662278 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal)
Jan 26 20:15:57 compute-0 podman[217866]: 2026-01-26 20:15:57.346204389 +0000 UTC m=+0.068873297 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 26 20:15:57 compute-0 podman[217864]: 2026-01-26 20:15:57.396236268 +0000 UTC m=+0.136000326 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, 
org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS)
Jan 26 20:15:59 compute-0 podman[192499]: time="2026-01-26T20:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:15:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:15:59 compute-0 nova_compute[183177]: 2026-01-26 20:15:59.749 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:15:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2186 "" "Go-http-client/1.1"
Jan 26 20:16:00 compute-0 nova_compute[183177]: 2026-01-26 20:16:00.329 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:01 compute-0 openstack_network_exporter[195363]: ERROR   20:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:16:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:16:01 compute-0 openstack_network_exporter[195363]: ERROR   20:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:16:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:16:03 compute-0 podman[217926]: 2026-01-26 20:16:03.310981699 +0000 UTC m=+0.065525706 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:16:04 compute-0 nova_compute[183177]: 2026-01-26 20:16:04.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:05 compute-0 nova_compute[183177]: 2026-01-26 20:16:05.331 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:09 compute-0 nova_compute[183177]: 2026-01-26 20:16:09.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:09 compute-0 nova_compute[183177]: 2026-01-26 20:16:09.754 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:10 compute-0 nova_compute[183177]: 2026-01-26 20:16:10.333 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:14 compute-0 nova_compute[183177]: 2026-01-26 20:16:14.757 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:15 compute-0 nova_compute[183177]: 2026-01-26 20:16:15.335 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:17 compute-0 nova_compute[183177]: 2026-01-26 20:16:17.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:17 compute-0 sshd-session[217951]: Connection closed by authenticating user root 188.166.116.149 port 39354 [preauth]
Jan 26 20:16:19 compute-0 nova_compute[183177]: 2026-01-26 20:16:19.761 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:20 compute-0 nova_compute[183177]: 2026-01-26 20:16:20.337 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:21 compute-0 ovn_controller[95396]: 2026-01-26T20:16:21Z|00260|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.940 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.942 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.970 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.971 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5663MB free_disk=73.08965682983398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.972 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:16:21 compute-0 nova_compute[183177]: 2026-01-26 20:16:21.972 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:16:22 compute-0 sshd-session[217954]: Connection closed by authenticating user root 142.93.140.142 port 44298 [preauth]
Jan 26 20:16:23 compute-0 nova_compute[183177]: 2026-01-26 20:16:23.180 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:16:23 compute-0 nova_compute[183177]: 2026-01-26 20:16:23.180 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:16:21 up  1:40,  0 user,  load average: 0.72, 0.35, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:16:23 compute-0 nova_compute[183177]: 2026-01-26 20:16:23.200 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:16:23 compute-0 nova_compute[183177]: 2026-01-26 20:16:23.709 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:16:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:16:24.118 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:16:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:16:24.119 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:16:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:16:24.119 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:16:24 compute-0 nova_compute[183177]: 2026-01-26 20:16:24.330 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:16:24 compute-0 nova_compute[183177]: 2026-01-26 20:16:24.331 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.358s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:16:24 compute-0 nova_compute[183177]: 2026-01-26 20:16:24.765 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.327 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.327 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.328 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.328 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.329 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:16:25 compute-0 nova_compute[183177]: 2026-01-26 20:16:25.339 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:28 compute-0 podman[217960]: 2026-01-26 20:16:28.337609273 +0000 UTC m=+0.075042974 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Jan 26 20:16:28 compute-0 podman[217961]: 2026-01-26 20:16:28.349853503 +0000 UTC m=+0.086983316 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:16:28 compute-0 podman[217959]: 2026-01-26 20:16:28.384505496 +0000 UTC m=+0.136004186 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20260120)
Jan 26 20:16:29 compute-0 sshd-session[218024]: Invalid user hadoop from 193.32.162.151 port 54292
Jan 26 20:16:29 compute-0 podman[192499]: time="2026-01-26T20:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:16:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:16:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 26 20:16:29 compute-0 nova_compute[183177]: 2026-01-26 20:16:29.803 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:29 compute-0 sshd-session[218024]: Connection closed by invalid user hadoop 193.32.162.151 port 54292 [preauth]
Jan 26 20:16:30 compute-0 nova_compute[183177]: 2026-01-26 20:16:30.340 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:31 compute-0 openstack_network_exporter[195363]: ERROR   20:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:16:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:16:31 compute-0 openstack_network_exporter[195363]: ERROR   20:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:16:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:16:32 compute-0 nova_compute[183177]: 2026-01-26 20:16:32.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:34 compute-0 nova_compute[183177]: 2026-01-26 20:16:34.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:16:34 compute-0 podman[218026]: 2026-01-26 20:16:34.313330267 +0000 UTC m=+0.066218847 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:16:34 compute-0 nova_compute[183177]: 2026-01-26 20:16:34.806 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:35 compute-0 nova_compute[183177]: 2026-01-26 20:16:35.344 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:39 compute-0 nova_compute[183177]: 2026-01-26 20:16:39.809 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:40 compute-0 nova_compute[183177]: 2026-01-26 20:16:40.392 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:44 compute-0 nova_compute[183177]: 2026-01-26 20:16:44.813 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:45 compute-0 nova_compute[183177]: 2026-01-26 20:16:45.394 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:49 compute-0 nova_compute[183177]: 2026-01-26 20:16:49.815 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:50 compute-0 nova_compute[183177]: 2026-01-26 20:16:50.396 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:53 compute-0 sshd-session[218051]: Connection closed by authenticating user root 188.166.116.149 port 45754 [preauth]
Jan 26 20:16:53 compute-0 sshd-session[218053]: Accepted publickey for zuul from 192.168.122.10 port 52630 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 20:16:53 compute-0 systemd-logind[794]: New session 28 of user zuul.
Jan 26 20:16:53 compute-0 systemd[1]: Started Session 28 of User zuul.
Jan 26 20:16:53 compute-0 sshd-session[218053]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 20:16:53 compute-0 sudo[218057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 20:16:53 compute-0 sudo[218057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 20:16:54 compute-0 nova_compute[183177]: 2026-01-26 20:16:54.818 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:55 compute-0 nova_compute[183177]: 2026-01-26 20:16:55.434 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:16:57 compute-0 sshd-session[218199]: Connection closed by authenticating user root 142.93.140.142 port 33546 [preauth]
Jan 26 20:16:58 compute-0 ovs-vsctl[218232]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 20:16:59 compute-0 podman[218259]: 2026-01-26 20:16:59.356315891 +0000 UTC m=+0.079626477 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:16:59 compute-0 podman[218258]: 2026-01-26 20:16:59.357568855 +0000 UTC m=+0.087602532 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, release=1755695350)
Jan 26 20:16:59 compute-0 podman[218255]: 2026-01-26 20:16:59.391457408 +0000 UTC m=+0.123587001 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 20:16:59 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 218081 (sos)
Jan 26 20:16:59 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 26 20:16:59 compute-0 podman[192499]: time="2026-01-26T20:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:16:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:16:59 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 26 20:16:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 26 20:16:59 compute-0 nova_compute[183177]: 2026-01-26 20:16:59.820 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:00 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 20:17:00 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 20:17:00 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 20:17:00 compute-0 nova_compute[183177]: 2026-01-26 20:17:00.500 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:01 compute-0 crontab[218698]: (root) LIST (root)
Jan 26 20:17:01 compute-0 openstack_network_exporter[195363]: ERROR   20:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:17:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:17:01 compute-0 openstack_network_exporter[195363]: ERROR   20:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:17:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:17:03 compute-0 systemd[1]: Starting Hostname Service...
Jan 26 20:17:03 compute-0 systemd[1]: Started Hostname Service.
Jan 26 20:17:04 compute-0 nova_compute[183177]: 2026-01-26 20:17:04.824 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:05 compute-0 podman[218950]: 2026-01-26 20:17:05.002661798 +0000 UTC m=+0.059835633 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:17:05 compute-0 nova_compute[183177]: 2026-01-26 20:17:05.500 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:09 compute-0 nova_compute[183177]: 2026-01-26 20:17:09.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:09 compute-0 nova_compute[183177]: 2026-01-26 20:17:09.827 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:10 compute-0 nova_compute[183177]: 2026-01-26 20:17:10.502 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:10 compute-0 ovs-appctl[219972]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 20:17:10 compute-0 ovs-appctl[219979]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 20:17:10 compute-0 ovs-appctl[219985]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 26 20:17:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2293642295-merged.mount: Deactivated successfully.
Jan 26 20:17:14 compute-0 nova_compute[183177]: 2026-01-26 20:17:14.830 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:15 compute-0 nova_compute[183177]: 2026-01-26 20:17:15.503 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:17 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 20:17:19 compute-0 nova_compute[183177]: 2026-01-26 20:17:19.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:19 compute-0 nova_compute[183177]: 2026-01-26 20:17:19.835 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:20 compute-0 systemd[1]: Starting Time & Date Service...
Jan 26 20:17:20 compute-0 nova_compute[183177]: 2026-01-26 20:17:20.542 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:20 compute-0 systemd[1]: Started Time & Date Service.
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.890 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.893 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.932 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.933 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5415MB free_disk=72.6140022277832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.933 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:17:21 compute-0 nova_compute[183177]: 2026-01-26 20:17:21.934 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:17:23 compute-0 nova_compute[183177]: 2026-01-26 20:17:23.026 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:17:23 compute-0 nova_compute[183177]: 2026-01-26 20:17:23.027 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:17:21 up  1:41,  0 user,  load average: 1.25, 0.52, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:17:23 compute-0 nova_compute[183177]: 2026-01-26 20:17:23.064 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:17:23 compute-0 nova_compute[183177]: 2026-01-26 20:17:23.827 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:17:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:17:24.121 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:17:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:17:24.121 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:17:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:17:24.121 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:17:24 compute-0 nova_compute[183177]: 2026-01-26 20:17:24.340 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:17:24 compute-0 nova_compute[183177]: 2026-01-26 20:17:24.341 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.407s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:17:24 compute-0 nova_compute[183177]: 2026-01-26 20:17:24.841 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:25 compute-0 nova_compute[183177]: 2026-01-26 20:17:25.544 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:26 compute-0 nova_compute[183177]: 2026-01-26 20:17:26.341 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:26 compute-0 nova_compute[183177]: 2026-01-26 20:17:26.342 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:26 compute-0 nova_compute[183177]: 2026-01-26 20:17:26.342 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:26 compute-0 nova_compute[183177]: 2026-01-26 20:17:26.342 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:17:27 compute-0 nova_compute[183177]: 2026-01-26 20:17:27.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:29 compute-0 podman[221357]: 2026-01-26 20:17:29.484045567 +0000 UTC m=+0.080985724 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 26 20:17:29 compute-0 podman[221358]: 2026-01-26 20:17:29.484930082 +0000 UTC m=+0.080134411 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 26 20:17:29 compute-0 podman[221361]: 2026-01-26 20:17:29.552881542 +0000 UTC m=+0.128635577 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:17:29 compute-0 podman[192499]: time="2026-01-26T20:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:17:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:17:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Jan 26 20:17:29 compute-0 nova_compute[183177]: 2026-01-26 20:17:29.888 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:30 compute-0 nova_compute[183177]: 2026-01-26 20:17:30.545 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:30 compute-0 sshd-session[221421]: Connection closed by authenticating user root 188.166.116.149 port 39888 [preauth]
Jan 26 20:17:31 compute-0 openstack_network_exporter[195363]: ERROR   20:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:17:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:17:31 compute-0 openstack_network_exporter[195363]: ERROR   20:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:17:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:17:32 compute-0 sshd-session[221424]: Connection closed by authenticating user root 142.93.140.142 port 44892 [preauth]
Jan 26 20:17:34 compute-0 nova_compute[183177]: 2026-01-26 20:17:34.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:17:34 compute-0 nova_compute[183177]: 2026-01-26 20:17:34.890 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:35 compute-0 podman[221426]: 2026-01-26 20:17:35.12833398 +0000 UTC m=+0.075808415 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:17:35 compute-0 nova_compute[183177]: 2026-01-26 20:17:35.546 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:39 compute-0 sudo[218057]: pam_unix(sudo:session): session closed for user root
Jan 26 20:17:39 compute-0 sshd-session[218056]: Received disconnect from 192.168.122.10 port 52630:11: disconnected by user
Jan 26 20:17:39 compute-0 sshd-session[218056]: Disconnected from user zuul 192.168.122.10 port 52630
Jan 26 20:17:39 compute-0 sshd-session[218053]: pam_unix(sshd:session): session closed for user zuul
Jan 26 20:17:39 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 26 20:17:39 compute-0 systemd[1]: session-28.scope: Consumed 1min 15.620s CPU time, 512.0M memory peak, read 125.3M from disk, written 23.2M to disk.
Jan 26 20:17:39 compute-0 systemd-logind[794]: Session 28 logged out. Waiting for processes to exit.
Jan 26 20:17:39 compute-0 systemd-logind[794]: Removed session 28.
Jan 26 20:17:39 compute-0 sshd-session[221449]: Accepted publickey for zuul from 192.168.122.10 port 32946 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 20:17:39 compute-0 systemd-logind[794]: New session 29 of user zuul.
Jan 26 20:17:39 compute-0 systemd[1]: Started Session 29 of User zuul.
Jan 26 20:17:39 compute-0 sshd-session[221449]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 20:17:39 compute-0 nova_compute[183177]: 2026-01-26 20:17:39.893 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:39 compute-0 sudo[221453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-01-26-ibfrcsm.tar.xz
Jan 26 20:17:39 compute-0 sudo[221453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 20:17:40 compute-0 sudo[221453]: pam_unix(sudo:session): session closed for user root
Jan 26 20:17:40 compute-0 sshd-session[221452]: Received disconnect from 192.168.122.10 port 32946:11: disconnected by user
Jan 26 20:17:40 compute-0 sshd-session[221452]: Disconnected from user zuul 192.168.122.10 port 32946
Jan 26 20:17:40 compute-0 sshd-session[221449]: pam_unix(sshd:session): session closed for user zuul
Jan 26 20:17:40 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Jan 26 20:17:40 compute-0 systemd-logind[794]: Session 29 logged out. Waiting for processes to exit.
Jan 26 20:17:40 compute-0 systemd-logind[794]: Removed session 29.
Jan 26 20:17:40 compute-0 sshd-session[221478]: Accepted publickey for zuul from 192.168.122.10 port 32962 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 20:17:40 compute-0 systemd-logind[794]: New session 30 of user zuul.
Jan 26 20:17:40 compute-0 systemd[1]: Started Session 30 of User zuul.
Jan 26 20:17:40 compute-0 sshd-session[221478]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 20:17:40 compute-0 sudo[221482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 26 20:17:40 compute-0 sudo[221482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 20:17:40 compute-0 sudo[221482]: pam_unix(sudo:session): session closed for user root
Jan 26 20:17:40 compute-0 sshd-session[221481]: Received disconnect from 192.168.122.10 port 32962:11: disconnected by user
Jan 26 20:17:40 compute-0 sshd-session[221481]: Disconnected from user zuul 192.168.122.10 port 32962
Jan 26 20:17:40 compute-0 sshd-session[221478]: pam_unix(sshd:session): session closed for user zuul
Jan 26 20:17:40 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Jan 26 20:17:40 compute-0 systemd-logind[794]: Session 30 logged out. Waiting for processes to exit.
Jan 26 20:17:40 compute-0 systemd-logind[794]: Removed session 30.
Jan 26 20:17:40 compute-0 nova_compute[183177]: 2026-01-26 20:17:40.548 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:44 compute-0 nova_compute[183177]: 2026-01-26 20:17:44.896 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:45 compute-0 nova_compute[183177]: 2026-01-26 20:17:45.550 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:49 compute-0 nova_compute[183177]: 2026-01-26 20:17:49.897 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:50 compute-0 nova_compute[183177]: 2026-01-26 20:17:50.592 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:50 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 26 20:17:50 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 26 20:17:54 compute-0 nova_compute[183177]: 2026-01-26 20:17:54.899 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:55 compute-0 nova_compute[183177]: 2026-01-26 20:17:55.632 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:17:59 compute-0 podman[192499]: time="2026-01-26T20:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:17:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:17:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2204 "" "Go-http-client/1.1"
Jan 26 20:17:59 compute-0 nova_compute[183177]: 2026-01-26 20:17:59.901 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:00 compute-0 podman[221513]: 2026-01-26 20:18:00.353372391 +0000 UTC m=+0.097987902 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 20:18:00 compute-0 podman[221514]: 2026-01-26 20:18:00.362626139 +0000 UTC m=+0.095348700 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:18:00 compute-0 podman[221512]: 2026-01-26 20:18:00.394369815 +0000 UTC m=+0.142580784 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 26 20:18:00 compute-0 nova_compute[183177]: 2026-01-26 20:18:00.634 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:01 compute-0 openstack_network_exporter[195363]: ERROR   20:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:18:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:18:01 compute-0 openstack_network_exporter[195363]: ERROR   20:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:18:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:18:04 compute-0 nova_compute[183177]: 2026-01-26 20:18:04.921 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:05 compute-0 podman[221576]: 2026-01-26 20:18:05.336008199 +0000 UTC m=+0.081932738 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:18:05 compute-0 nova_compute[183177]: 2026-01-26 20:18:05.637 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:07 compute-0 sshd-session[221600]: Connection closed by authenticating user root 142.93.140.142 port 53028 [preauth]
Jan 26 20:18:09 compute-0 sshd-session[221602]: Connection closed by authenticating user root 188.166.116.149 port 44780 [preauth]
Jan 26 20:18:09 compute-0 nova_compute[183177]: 2026-01-26 20:18:09.955 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:10 compute-0 nova_compute[183177]: 2026-01-26 20:18:10.639 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:11 compute-0 nova_compute[183177]: 2026-01-26 20:18:11.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:15 compute-0 nova_compute[183177]: 2026-01-26 20:18:15.015 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:15 compute-0 nova_compute[183177]: 2026-01-26 20:18:15.641 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:16 compute-0 nova_compute[183177]: 2026-01-26 20:18:16.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:16 compute-0 nova_compute[183177]: 2026-01-26 20:18:16.153 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 20:18:18 compute-0 nova_compute[183177]: 2026-01-26 20:18:18.659 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:18 compute-0 nova_compute[183177]: 2026-01-26 20:18:18.660 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 20:18:19 compute-0 nova_compute[183177]: 2026-01-26 20:18:19.169 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 20:18:19 compute-0 nova_compute[183177]: 2026-01-26 20:18:19.662 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:20 compute-0 nova_compute[183177]: 2026-01-26 20:18:20.017 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:20 compute-0 nova_compute[183177]: 2026-01-26 20:18:20.643 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.678 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.679 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.679 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.680 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.917 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.919 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.963 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.964 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5517MB free_disk=73.08947372436523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.964 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:18:21 compute-0 nova_compute[183177]: 2026-01-26 20:18:21.965 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.086 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.086 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:18:21 up  1:42,  0 user,  load average: 0.69, 0.49, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.112 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.129 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.130 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.153 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.183 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.207 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:18:23 compute-0 nova_compute[183177]: 2026-01-26 20:18:23.750 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:18:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:18:24.123 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:18:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:18:24.123 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:18:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:18:24.123 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:18:24 compute-0 nova_compute[183177]: 2026-01-26 20:18:24.263 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:18:24 compute-0 nova_compute[183177]: 2026-01-26 20:18:24.263 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.299s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:18:25 compute-0 nova_compute[183177]: 2026-01-26 20:18:25.020 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:25 compute-0 nova_compute[183177]: 2026-01-26 20:18:25.646 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:26 compute-0 nova_compute[183177]: 2026-01-26 20:18:26.264 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:26 compute-0 nova_compute[183177]: 2026-01-26 20:18:26.265 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:26 compute-0 nova_compute[183177]: 2026-01-26 20:18:26.265 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:26 compute-0 nova_compute[183177]: 2026-01-26 20:18:26.265 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:18:28 compute-0 nova_compute[183177]: 2026-01-26 20:18:28.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:29 compute-0 podman[192499]: time="2026-01-26T20:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:18:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:18:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2198 "" "Go-http-client/1.1"
Jan 26 20:18:30 compute-0 nova_compute[183177]: 2026-01-26 20:18:30.046 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:30 compute-0 nova_compute[183177]: 2026-01-26 20:18:30.648 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:31 compute-0 podman[221608]: 2026-01-26 20:18:31.374677069 +0000 UTC m=+0.106821869 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 26 20:18:31 compute-0 podman[221607]: 2026-01-26 20:18:31.390866606 +0000 UTC m=+0.128773672 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 26 20:18:31 compute-0 podman[221606]: 2026-01-26 20:18:31.400856205 +0000 UTC m=+0.141932966 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:18:31 compute-0 openstack_network_exporter[195363]: ERROR   20:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:18:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:18:31 compute-0 openstack_network_exporter[195363]: ERROR   20:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:18:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:18:34 compute-0 nova_compute[183177]: 2026-01-26 20:18:34.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:35 compute-0 nova_compute[183177]: 2026-01-26 20:18:35.049 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:35 compute-0 nova_compute[183177]: 2026-01-26 20:18:35.651 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:36 compute-0 podman[221665]: 2026-01-26 20:18:36.30577385 +0000 UTC m=+0.061469748 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:18:38 compute-0 nova_compute[183177]: 2026-01-26 20:18:38.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:39 compute-0 nova_compute[183177]: 2026-01-26 20:18:39.658 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:18:40 compute-0 nova_compute[183177]: 2026-01-26 20:18:40.053 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:40 compute-0 nova_compute[183177]: 2026-01-26 20:18:40.654 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:43 compute-0 sshd-session[221689]: Invalid user hadoop from 193.32.162.151 port 59914
Jan 26 20:18:43 compute-0 sshd-session[221691]: Connection closed by authenticating user root 142.93.140.142 port 48624 [preauth]
Jan 26 20:18:43 compute-0 sshd-session[221689]: Connection closed by invalid user hadoop 193.32.162.151 port 59914 [preauth]
Jan 26 20:18:45 compute-0 nova_compute[183177]: 2026-01-26 20:18:45.094 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:45 compute-0 nova_compute[183177]: 2026-01-26 20:18:45.657 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:47 compute-0 sshd-session[221693]: Connection closed by authenticating user root 188.166.116.149 port 33982 [preauth]
Jan 26 20:18:50 compute-0 nova_compute[183177]: 2026-01-26 20:18:50.098 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:50 compute-0 nova_compute[183177]: 2026-01-26 20:18:50.658 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:55 compute-0 nova_compute[183177]: 2026-01-26 20:18:55.101 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:55 compute-0 nova_compute[183177]: 2026-01-26 20:18:55.661 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:18:59 compute-0 podman[192499]: time="2026-01-26T20:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:18:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:18:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2199 "" "Go-http-client/1.1"
Jan 26 20:19:00 compute-0 nova_compute[183177]: 2026-01-26 20:19:00.103 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:00 compute-0 nova_compute[183177]: 2026-01-26 20:19:00.696 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:01 compute-0 openstack_network_exporter[195363]: ERROR   20:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:19:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:19:01 compute-0 openstack_network_exporter[195363]: ERROR   20:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:19:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:19:02 compute-0 podman[221697]: 2026-01-26 20:19:02.321898784 +0000 UTC m=+0.058043285 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 20:19:02 compute-0 podman[221696]: 2026-01-26 20:19:02.336684943 +0000 UTC m=+0.081281682 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41)
Jan 26 20:19:02 compute-0 podman[221695]: 2026-01-26 20:19:02.351884352 +0000 UTC m=+0.101523647 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 26 20:19:05 compute-0 nova_compute[183177]: 2026-01-26 20:19:05.107 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:05 compute-0 nova_compute[183177]: 2026-01-26 20:19:05.697 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:07 compute-0 podman[221758]: 2026-01-26 20:19:07.324999524 +0000 UTC m=+0.069558325 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:19:10 compute-0 nova_compute[183177]: 2026-01-26 20:19:10.110 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:10 compute-0 nova_compute[183177]: 2026-01-26 20:19:10.700 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:12 compute-0 nova_compute[183177]: 2026-01-26 20:19:12.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:15 compute-0 nova_compute[183177]: 2026-01-26 20:19:15.114 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:15 compute-0 nova_compute[183177]: 2026-01-26 20:19:15.702 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:20 compute-0 nova_compute[183177]: 2026-01-26 20:19:20.116 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:20 compute-0 nova_compute[183177]: 2026-01-26 20:19:20.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:20 compute-0 nova_compute[183177]: 2026-01-26 20:19:20.752 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:21 compute-0 sshd-session[221783]: Connection closed by authenticating user root 142.93.140.142 port 49664 [preauth]
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.680 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.680 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.681 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.681 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.836 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.837 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.876 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.877 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5626MB free_disk=73.08947372436523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.877 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:19:22 compute-0 nova_compute[183177]: 2026-01-26 20:19:22.877 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:19:23 compute-0 nova_compute[183177]: 2026-01-26 20:19:23.929 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:19:23 compute-0 nova_compute[183177]: 2026-01-26 20:19:23.929 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:19:22 up  1:43,  0 user,  load average: 0.56, 0.49, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:19:24 compute-0 nova_compute[183177]: 2026-01-26 20:19:24.024 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:19:24.123 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:19:24.124 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:19:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:19:24.124 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:19:24 compute-0 nova_compute[183177]: 2026-01-26 20:19:24.547 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:19:25 compute-0 nova_compute[183177]: 2026-01-26 20:19:25.059 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:19:25 compute-0 nova_compute[183177]: 2026-01-26 20:19:25.060 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.183s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:19:25 compute-0 nova_compute[183177]: 2026-01-26 20:19:25.169 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:25 compute-0 nova_compute[183177]: 2026-01-26 20:19:25.753 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:27 compute-0 sshd-session[221787]: Connection closed by authenticating user root 188.166.116.149 port 33472 [preauth]
Jan 26 20:19:28 compute-0 nova_compute[183177]: 2026-01-26 20:19:28.060 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:28 compute-0 nova_compute[183177]: 2026-01-26 20:19:28.061 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:28 compute-0 nova_compute[183177]: 2026-01-26 20:19:28.062 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:19:28 compute-0 nova_compute[183177]: 2026-01-26 20:19:28.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:29 compute-0 podman[192499]: time="2026-01-26T20:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:19:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:19:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Jan 26 20:19:30 compute-0 nova_compute[183177]: 2026-01-26 20:19:30.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:30 compute-0 nova_compute[183177]: 2026-01-26 20:19:30.171 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:30 compute-0 nova_compute[183177]: 2026-01-26 20:19:30.755 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:32 compute-0 openstack_network_exporter[195363]: ERROR   20:19:32 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:19:32 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:19:32 compute-0 openstack_network_exporter[195363]: ERROR   20:19:32 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:19:32 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:19:32 compute-0 nova_compute[183177]: 2026-01-26 20:19:32.742 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:33 compute-0 podman[221791]: 2026-01-26 20:19:33.340967883 +0000 UTC m=+0.081759074 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 20:19:33 compute-0 podman[221789]: 2026-01-26 20:19:33.340986903 +0000 UTC m=+0.090301383 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120)
Jan 26 20:19:33 compute-0 podman[221790]: 2026-01-26 20:19:33.361736173 +0000 UTC m=+0.093477260 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 20:19:35 compute-0 nova_compute[183177]: 2026-01-26 20:19:35.174 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:35 compute-0 nova_compute[183177]: 2026-01-26 20:19:35.756 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:36 compute-0 nova_compute[183177]: 2026-01-26 20:19:36.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:19:38 compute-0 podman[221852]: 2026-01-26 20:19:38.330397946 +0000 UTC m=+0.074920121 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:19:40 compute-0 nova_compute[183177]: 2026-01-26 20:19:40.178 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:40 compute-0 nova_compute[183177]: 2026-01-26 20:19:40.796 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:45 compute-0 nova_compute[183177]: 2026-01-26 20:19:45.181 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:45 compute-0 nova_compute[183177]: 2026-01-26 20:19:45.799 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:50 compute-0 nova_compute[183177]: 2026-01-26 20:19:50.220 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:50 compute-0 nova_compute[183177]: 2026-01-26 20:19:50.799 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:55 compute-0 nova_compute[183177]: 2026-01-26 20:19:55.223 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:55 compute-0 nova_compute[183177]: 2026-01-26 20:19:55.801 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:19:57 compute-0 sshd-session[221876]: Connection closed by authenticating user root 142.93.140.142 port 55614 [preauth]
Jan 26 20:19:59 compute-0 podman[192499]: time="2026-01-26T20:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:19:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:19:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Jan 26 20:20:00 compute-0 nova_compute[183177]: 2026-01-26 20:20:00.228 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:00 compute-0 nova_compute[183177]: 2026-01-26 20:20:00.803 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:01 compute-0 openstack_network_exporter[195363]: ERROR   20:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:20:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:20:01 compute-0 openstack_network_exporter[195363]: ERROR   20:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:20:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:20:04 compute-0 podman[221880]: 2026-01-26 20:20:04.336811148 +0000 UTC m=+0.056416041 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260120, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 26 20:20:04 compute-0 podman[221879]: 2026-01-26 20:20:04.356346154 +0000 UTC m=+0.082391541 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 20:20:04 compute-0 podman[221878]: 2026-01-26 20:20:04.383911037 +0000 UTC m=+0.119708017 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 
Base Image, org.label-schema.schema-version=1.0)
Jan 26 20:20:05 compute-0 nova_compute[183177]: 2026-01-26 20:20:05.269 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:05 compute-0 sshd-session[221944]: Connection closed by authenticating user root 188.166.116.149 port 47042 [preauth]
Jan 26 20:20:05 compute-0 nova_compute[183177]: 2026-01-26 20:20:05.802 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:09 compute-0 podman[221946]: 2026-01-26 20:20:09.362650602 +0000 UTC m=+0.096098022 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:20:10 compute-0 nova_compute[183177]: 2026-01-26 20:20:10.271 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:10 compute-0 nova_compute[183177]: 2026-01-26 20:20:10.804 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:13 compute-0 nova_compute[183177]: 2026-01-26 20:20:13.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:15 compute-0 nova_compute[183177]: 2026-01-26 20:20:15.275 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:15 compute-0 nova_compute[183177]: 2026-01-26 20:20:15.851 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:20 compute-0 nova_compute[183177]: 2026-01-26 20:20:20.278 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:20 compute-0 nova_compute[183177]: 2026-01-26 20:20:20.853 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:22 compute-0 nova_compute[183177]: 2026-01-26 20:20:22.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.670 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.670 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.817 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.818 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.852 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.853 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5672MB free_disk=73.0895767211914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.853 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:20:23 compute-0 nova_compute[183177]: 2026-01-26 20:20:23.854 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:20:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:20:24.125 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:20:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:20:24.125 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:20:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:20:24.125 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:20:24 compute-0 nova_compute[183177]: 2026-01-26 20:20:24.918 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:20:24 compute-0 nova_compute[183177]: 2026-01-26 20:20:24.918 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:20:23 up  1:44,  0 user,  load average: 0.31, 0.44, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:20:24 compute-0 nova_compute[183177]: 2026-01-26 20:20:24.980 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:20:25 compute-0 nova_compute[183177]: 2026-01-26 20:20:25.319 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:25 compute-0 nova_compute[183177]: 2026-01-26 20:20:25.491 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:20:25 compute-0 nova_compute[183177]: 2026-01-26 20:20:25.853 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:26 compute-0 nova_compute[183177]: 2026-01-26 20:20:26.002 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:20:26 compute-0 nova_compute[183177]: 2026-01-26 20:20:26.003 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:20:29 compute-0 podman[192499]: time="2026-01-26T20:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:20:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:20:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2206 "" "Go-http-client/1.1"
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.004 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.004 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.005 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.321 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:30 compute-0 nova_compute[183177]: 2026-01-26 20:20:30.855 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:31 compute-0 openstack_network_exporter[195363]: ERROR   20:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:20:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:20:31 compute-0 openstack_network_exporter[195363]: ERROR   20:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:20:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:20:34 compute-0 sshd-session[221972]: Connection closed by authenticating user root 142.93.140.142 port 42412 [preauth]
Jan 26 20:20:35 compute-0 nova_compute[183177]: 2026-01-26 20:20:35.323 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:35 compute-0 podman[221976]: 2026-01-26 20:20:35.335833549 +0000 UTC m=+0.076136741 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260120, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 26 20:20:35 compute-0 podman[221975]: 2026-01-26 20:20:35.369906586 +0000 UTC m=+0.112811398 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 26 20:20:35 compute-0 podman[221974]: 2026-01-26 20:20:35.387833739 +0000 UTC m=+0.130744001 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:20:35 compute-0 nova_compute[183177]: 2026-01-26 20:20:35.856 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:37 compute-0 nova_compute[183177]: 2026-01-26 20:20:37.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:40 compute-0 nova_compute[183177]: 2026-01-26 20:20:40.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:20:40 compute-0 podman[222035]: 2026-01-26 20:20:40.305285718 +0000 UTC m=+0.062080103 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:20:40 compute-0 nova_compute[183177]: 2026-01-26 20:20:40.326 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:40 compute-0 nova_compute[183177]: 2026-01-26 20:20:40.903 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:45 compute-0 nova_compute[183177]: 2026-01-26 20:20:45.329 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:45 compute-0 nova_compute[183177]: 2026-01-26 20:20:45.954 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:46 compute-0 sshd-session[222059]: Connection closed by authenticating user root 188.166.116.149 port 58690 [preauth]
Jan 26 20:20:50 compute-0 nova_compute[183177]: 2026-01-26 20:20:50.331 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:50 compute-0 nova_compute[183177]: 2026-01-26 20:20:50.988 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:55 compute-0 nova_compute[183177]: 2026-01-26 20:20:55.335 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:55 compute-0 nova_compute[183177]: 2026-01-26 20:20:55.989 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:20:59 compute-0 podman[192499]: time="2026-01-26T20:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:20:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:20:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2207 "" "Go-http-client/1.1"
Jan 26 20:21:00 compute-0 nova_compute[183177]: 2026-01-26 20:21:00.339 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:00 compute-0 nova_compute[183177]: 2026-01-26 20:21:00.991 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:01 compute-0 openstack_network_exporter[195363]: ERROR   20:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:21:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:21:01 compute-0 openstack_network_exporter[195363]: ERROR   20:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:21:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:21:03 compute-0 sshd-session[222061]: Invalid user hadoop from 193.32.162.151 port 37298
Jan 26 20:21:03 compute-0 sshd-session[222061]: Connection closed by invalid user hadoop 193.32.162.151 port 37298 [preauth]
Jan 26 20:21:05 compute-0 nova_compute[183177]: 2026-01-26 20:21:05.341 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:05 compute-0 nova_compute[183177]: 2026-01-26 20:21:05.994 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:06 compute-0 podman[222065]: 2026-01-26 20:21:06.313051333 +0000 UTC m=+0.060135070 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260120, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:21:06 compute-0 podman[222064]: 2026-01-26 20:21:06.325094538 +0000 UTC m=+0.072189645 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal)
Jan 26 20:21:06 compute-0 podman[222063]: 2026-01-26 20:21:06.341811348 +0000 UTC m=+0.097288561 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 26 20:21:10 compute-0 nova_compute[183177]: 2026-01-26 20:21:10.345 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:11 compute-0 nova_compute[183177]: 2026-01-26 20:21:11.000 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:11 compute-0 podman[222130]: 2026-01-26 20:21:11.348927281 +0000 UTC m=+0.091012383 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:21:11 compute-0 sshd-session[222128]: Connection closed by authenticating user root 142.93.140.142 port 58138 [preauth]
Jan 26 20:21:13 compute-0 nova_compute[183177]: 2026-01-26 20:21:13.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:15 compute-0 nova_compute[183177]: 2026-01-26 20:21:15.346 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:16 compute-0 nova_compute[183177]: 2026-01-26 20:21:16.003 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:20 compute-0 nova_compute[183177]: 2026-01-26 20:21:20.348 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:21 compute-0 nova_compute[183177]: 2026-01-26 20:21:21.006 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:23 compute-0 nova_compute[183177]: 2026-01-26 20:21:23.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:24.126 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:21:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:24.127 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:21:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:24.127 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.675 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.898 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.900 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.932 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.933 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.0895767211914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.934 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:21:24 compute-0 nova_compute[183177]: 2026-01-26 20:21:24.935 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:21:25 compute-0 nova_compute[183177]: 2026-01-26 20:21:25.352 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:26 compute-0 nova_compute[183177]: 2026-01-26 20:21:26.004 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:21:26 compute-0 nova_compute[183177]: 2026-01-26 20:21:26.004 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:21:24 up  1:45,  0 user,  load average: 0.22, 0.39, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:21:26 compute-0 nova_compute[183177]: 2026-01-26 20:21:26.041 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:26 compute-0 nova_compute[183177]: 2026-01-26 20:21:26.061 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:21:26 compute-0 sshd-session[222156]: Connection closed by authenticating user root 188.166.116.149 port 41694 [preauth]
Jan 26 20:21:26 compute-0 nova_compute[183177]: 2026-01-26 20:21:26.571 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:21:27 compute-0 nova_compute[183177]: 2026-01-26 20:21:27.086 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:21:27 compute-0 nova_compute[183177]: 2026-01-26 20:21:27.087 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:21:29 compute-0 podman[192499]: time="2026-01-26T20:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:21:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:21:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2204 "" "Go-http-client/1.1"
Jan 26 20:21:30 compute-0 nova_compute[183177]: 2026-01-26 20:21:30.384 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:31 compute-0 nova_compute[183177]: 2026-01-26 20:21:31.044 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:31 compute-0 nova_compute[183177]: 2026-01-26 20:21:31.087 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:31 compute-0 nova_compute[183177]: 2026-01-26 20:21:31.087 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:31 compute-0 nova_compute[183177]: 2026-01-26 20:21:31.088 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:21:31 compute-0 nova_compute[183177]: 2026-01-26 20:21:31.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:31 compute-0 openstack_network_exporter[195363]: ERROR   20:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:21:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:21:31 compute-0 openstack_network_exporter[195363]: ERROR   20:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:21:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:21:32 compute-0 nova_compute[183177]: 2026-01-26 20:21:32.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:35 compute-0 nova_compute[183177]: 2026-01-26 20:21:35.387 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:36 compute-0 nova_compute[183177]: 2026-01-26 20:21:36.044 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:37 compute-0 podman[222159]: 2026-01-26 20:21:37.302798582 +0000 UTC m=+0.056253556 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, 
release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 26 20:21:37 compute-0 podman[222160]: 2026-01-26 20:21:37.325926785 +0000 UTC m=+0.075574306 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:21:37 compute-0 podman[222158]: 2026-01-26 20:21:37.338765901 +0000 UTC m=+0.089841281 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260120)
Jan 26 20:21:39 compute-0 nova_compute[183177]: 2026-01-26 20:21:39.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:21:40 compute-0 nova_compute[183177]: 2026-01-26 20:21:40.390 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:41 compute-0 nova_compute[183177]: 2026-01-26 20:21:41.046 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:42 compute-0 podman[222223]: 2026-01-26 20:21:42.325490585 +0000 UTC m=+0.067157639 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:21:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:43.155 104672 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'de:ea:f2', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '5e:85:94:26:02:b6'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 26 20:21:43 compute-0 nova_compute[183177]: 2026-01-26 20:21:43.156 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:43 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:43.157 104672 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 26 20:21:45 compute-0 nova_compute[183177]: 2026-01-26 20:21:45.393 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:46 compute-0 nova_compute[183177]: 2026-01-26 20:21:46.068 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:49 compute-0 sshd-session[222248]: Connection closed by authenticating user root 142.93.140.142 port 32808 [preauth]
Jan 26 20:21:50 compute-0 nova_compute[183177]: 2026-01-26 20:21:50.395 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:51 compute-0 nova_compute[183177]: 2026-01-26 20:21:51.070 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:51 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:21:51.158 104672 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=4b7fe4ab-0aa1-433c-a7da-fec1fee5732c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 26 20:21:55 compute-0 nova_compute[183177]: 2026-01-26 20:21:55.397 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:56 compute-0 nova_compute[183177]: 2026-01-26 20:21:56.109 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:21:59 compute-0 podman[192499]: time="2026-01-26T20:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:21:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:21:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2204 "" "Go-http-client/1.1"
Jan 26 20:22:00 compute-0 nova_compute[183177]: 2026-01-26 20:22:00.400 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:01 compute-0 nova_compute[183177]: 2026-01-26 20:22:01.113 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:01 compute-0 openstack_network_exporter[195363]: ERROR   20:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:22:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:22:01 compute-0 openstack_network_exporter[195363]: ERROR   20:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:22:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:22:01 compute-0 anacron[30974]: Job `cron.monthly' started
Jan 26 20:22:01 compute-0 anacron[30974]: Job `cron.monthly' terminated
Jan 26 20:22:01 compute-0 anacron[30974]: Normal exit (3 jobs run)
Jan 26 20:22:04 compute-0 sshd-session[222253]: Connection closed by authenticating user root 188.166.116.149 port 54242 [preauth]
Jan 26 20:22:05 compute-0 nova_compute[183177]: 2026-01-26 20:22:05.404 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:06 compute-0 nova_compute[183177]: 2026-01-26 20:22:06.115 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:08 compute-0 podman[222263]: 2026-01-26 20:22:08.3300542 +0000 UTC m=+0.058256939 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Jan 26 20:22:08 compute-0 podman[222257]: 2026-01-26 20:22:08.331304303 +0000 UTC m=+0.073275533 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Jan 26 20:22:08 compute-0 podman[222256]: 2026-01-26 20:22:08.372508763 +0000 UTC m=+0.120919247 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 26 20:22:10 compute-0 nova_compute[183177]: 2026-01-26 20:22:10.406 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:11 compute-0 nova_compute[183177]: 2026-01-26 20:22:11.118 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:13 compute-0 podman[222317]: 2026-01-26 20:22:13.351115259 +0000 UTC m=+0.099881451 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:22:14 compute-0 nova_compute[183177]: 2026-01-26 20:22:14.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:15 compute-0 nova_compute[183177]: 2026-01-26 20:22:15.408 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:16 compute-0 nova_compute[183177]: 2026-01-26 20:22:16.140 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:20 compute-0 nova_compute[183177]: 2026-01-26 20:22:20.411 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:21 compute-0 nova_compute[183177]: 2026-01-26 20:22:21.142 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:22:24.127 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:22:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:22:24.128 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:22:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:22:24.128 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:22:25 compute-0 nova_compute[183177]: 2026-01-26 20:22:25.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:25 compute-0 sshd-session[222342]: Connection closed by authenticating user root 142.93.140.142 port 44198 [preauth]
Jan 26 20:22:25 compute-0 nova_compute[183177]: 2026-01-26 20:22:25.413 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.144 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.669 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.826 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.827 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.854 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.855 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.08957290649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.855 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:22:26 compute-0 nova_compute[183177]: 2026-01-26 20:22:26.856 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:22:27 compute-0 nova_compute[183177]: 2026-01-26 20:22:27.924 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:22:27 compute-0 nova_compute[183177]: 2026-01-26 20:22:27.925 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:22:26 up  1:46,  0 user,  load average: 0.12, 0.33, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:22:27 compute-0 nova_compute[183177]: 2026-01-26 20:22:27.951 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:22:28 compute-0 nova_compute[183177]: 2026-01-26 20:22:28.748 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:22:29 compute-0 nova_compute[183177]: 2026-01-26 20:22:29.259 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:22:29 compute-0 nova_compute[183177]: 2026-01-26 20:22:29.260 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.404s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:22:29 compute-0 podman[192499]: time="2026-01-26T20:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:22:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:22:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2207 "" "Go-http-client/1.1"
Jan 26 20:22:30 compute-0 nova_compute[183177]: 2026-01-26 20:22:30.419 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:31 compute-0 nova_compute[183177]: 2026-01-26 20:22:31.150 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:31 compute-0 nova_compute[183177]: 2026-01-26 20:22:31.261 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:31 compute-0 nova_compute[183177]: 2026-01-26 20:22:31.262 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:31 compute-0 nova_compute[183177]: 2026-01-26 20:22:31.262 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:22:31 compute-0 openstack_network_exporter[195363]: ERROR   20:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:22:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:22:31 compute-0 openstack_network_exporter[195363]: ERROR   20:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:22:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:22:32 compute-0 nova_compute[183177]: 2026-01-26 20:22:32.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:32 compute-0 nova_compute[183177]: 2026-01-26 20:22:32.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:35 compute-0 nova_compute[183177]: 2026-01-26 20:22:35.422 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:36 compute-0 nova_compute[183177]: 2026-01-26 20:22:36.152 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:39 compute-0 nova_compute[183177]: 2026-01-26 20:22:39.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:39 compute-0 podman[222346]: 2026-01-26 20:22:39.330107626 +0000 UTC m=+0.069431710 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 26 20:22:39 compute-0 podman[222347]: 2026-01-26 20:22:39.340537707 +0000 UTC m=+0.085675268 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 26 20:22:39 compute-0 podman[222345]: 2026-01-26 20:22:39.367254857 +0000 UTC m=+0.116085007 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:22:40 compute-0 nova_compute[183177]: 2026-01-26 20:22:40.424 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:41 compute-0 nova_compute[183177]: 2026-01-26 20:22:41.195 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:42 compute-0 nova_compute[183177]: 2026-01-26 20:22:42.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:22:42 compute-0 sshd-session[222408]: Connection closed by authenticating user root 188.166.116.149 port 53690 [preauth]
Jan 26 20:22:44 compute-0 podman[222410]: 2026-01-26 20:22:44.302612588 +0000 UTC m=+0.060835539 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:22:45 compute-0 nova_compute[183177]: 2026-01-26 20:22:45.428 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:46 compute-0 nova_compute[183177]: 2026-01-26 20:22:46.197 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:50 compute-0 nova_compute[183177]: 2026-01-26 20:22:50.430 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:51 compute-0 nova_compute[183177]: 2026-01-26 20:22:51.200 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:55 compute-0 nova_compute[183177]: 2026-01-26 20:22:55.432 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:56 compute-0 nova_compute[183177]: 2026-01-26 20:22:56.202 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:22:59 compute-0 podman[192499]: time="2026-01-26T20:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:22:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:22:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2205 "" "Go-http-client/1.1"
Jan 26 20:22:59 compute-0 sshd-session[222435]: Connection closed by authenticating user root 142.93.140.142 port 44462 [preauth]
Jan 26 20:23:00 compute-0 nova_compute[183177]: 2026-01-26 20:23:00.436 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:01 compute-0 nova_compute[183177]: 2026-01-26 20:23:01.246 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:01 compute-0 openstack_network_exporter[195363]: ERROR   20:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:23:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:23:01 compute-0 openstack_network_exporter[195363]: ERROR   20:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:23:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:23:05 compute-0 nova_compute[183177]: 2026-01-26 20:23:05.439 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:06 compute-0 nova_compute[183177]: 2026-01-26 20:23:06.247 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:10 compute-0 podman[222439]: 2026-01-26 20:23:10.363891472 +0000 UTC m=+0.091646459 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Jan 26 20:23:10 compute-0 podman[222438]: 2026-01-26 20:23:10.366511343 +0000 UTC m=+0.097266010 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=openstack_network_exporter)
Jan 26 20:23:10 compute-0 podman[222437]: 2026-01-26 20:23:10.406429897 +0000 UTC m=+0.144289806 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 26 20:23:10 compute-0 nova_compute[183177]: 2026-01-26 20:23:10.440 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:11 compute-0 nova_compute[183177]: 2026-01-26 20:23:11.249 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:14 compute-0 nova_compute[183177]: 2026-01-26 20:23:14.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:15 compute-0 podman[222499]: 2026-01-26 20:23:15.298058363 +0000 UTC m=+0.051007875 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 26 20:23:15 compute-0 nova_compute[183177]: 2026-01-26 20:23:15.444 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:16 compute-0 nova_compute[183177]: 2026-01-26 20:23:16.252 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:16 compute-0 sshd-session[222524]: Invalid user hadoop from 193.32.162.151 port 42868
Jan 26 20:23:17 compute-0 sshd-session[222524]: Connection closed by invalid user hadoop 193.32.162.151 port 42868 [preauth]
Jan 26 20:23:19 compute-0 nova_compute[183177]: 2026-01-26 20:23:19.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:19 compute-0 nova_compute[183177]: 2026-01-26 20:23:19.154 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 26 20:23:20 compute-0 sshd-session[222526]: Connection closed by authenticating user root 188.166.116.149 port 34212 [preauth]
Jan 26 20:23:20 compute-0 nova_compute[183177]: 2026-01-26 20:23:20.446 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:21 compute-0 nova_compute[183177]: 2026-01-26 20:23:21.255 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:21 compute-0 nova_compute[183177]: 2026-01-26 20:23:21.663 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:21 compute-0 nova_compute[183177]: 2026-01-26 20:23:21.663 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 26 20:23:22 compute-0 nova_compute[183177]: 2026-01-26 20:23:22.173 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 26 20:23:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:23:24.129 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:23:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:23:24.129 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:23:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:23:24.130 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:23:25 compute-0 nova_compute[183177]: 2026-01-26 20:23:25.448 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:25 compute-0 nova_compute[183177]: 2026-01-26 20:23:25.664 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:26 compute-0 nova_compute[183177]: 2026-01-26 20:23:26.255 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.674 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.880 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.881 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.903 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.904 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.08980178833008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.905 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:23:27 compute-0 nova_compute[183177]: 2026-01-26 20:23:27.906 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:23:28 compute-0 nova_compute[183177]: 2026-01-26 20:23:28.956 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:23:28 compute-0 nova_compute[183177]: 2026-01-26 20:23:28.956 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:23:27 up  1:47,  0 user,  load average: 0.04, 0.26, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:23:28 compute-0 nova_compute[183177]: 2026-01-26 20:23:28.980 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing inventories for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.122 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating ProviderTree inventory for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.123 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Updating inventory in ProviderTree for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.140 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing aggregate associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.175 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Refreshing trait associations for resource provider a47e311f-639f-4d60-b79d-85bbf53e2f35, traits: COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_EMULATED,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,COMPUTE_ARCH_X86_64,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_SCSI,HW_ARCH_X86_64,COMPUTE_SOUND_MODEL_SB16,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_SOUND_MODEL_AC97,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.202 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:23:29 compute-0 nova_compute[183177]: 2026-01-26 20:23:29.716 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:23:29 compute-0 podman[192499]: time="2026-01-26T20:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:23:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:23:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2204 "" "Go-http-client/1.1"
Jan 26 20:23:30 compute-0 nova_compute[183177]: 2026-01-26 20:23:30.229 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:23:30 compute-0 nova_compute[183177]: 2026-01-26 20:23:30.229 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.324s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:23:30 compute-0 nova_compute[183177]: 2026-01-26 20:23:30.451 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:31 compute-0 nova_compute[183177]: 2026-01-26 20:23:31.257 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:31 compute-0 openstack_network_exporter[195363]: ERROR   20:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:23:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:23:31 compute-0 openstack_network_exporter[195363]: ERROR   20:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:23:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:23:33 compute-0 nova_compute[183177]: 2026-01-26 20:23:33.229 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:33 compute-0 nova_compute[183177]: 2026-01-26 20:23:33.230 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:33 compute-0 nova_compute[183177]: 2026-01-26 20:23:33.230 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:33 compute-0 nova_compute[183177]: 2026-01-26 20:23:33.230 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:23:34 compute-0 nova_compute[183177]: 2026-01-26 20:23:34.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:35 compute-0 nova_compute[183177]: 2026-01-26 20:23:35.455 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:35 compute-0 sshd-session[222530]: Connection closed by authenticating user root 142.93.140.142 port 44352 [preauth]
Jan 26 20:23:36 compute-0 nova_compute[183177]: 2026-01-26 20:23:36.259 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:40 compute-0 nova_compute[183177]: 2026-01-26 20:23:40.457 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:41 compute-0 nova_compute[183177]: 2026-01-26 20:23:41.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:41 compute-0 nova_compute[183177]: 2026-01-26 20:23:41.289 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:41 compute-0 podman[222533]: 2026-01-26 20:23:41.351850126 +0000 UTC m=+0.095863013 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 26 20:23:41 compute-0 podman[222534]: 2026-01-26 20:23:41.362251585 +0000 UTC m=+0.093036595 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2)
Jan 26 20:23:41 compute-0 podman[222532]: 2026-01-26 20:23:41.387969138 +0000 UTC m=+0.130944467 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 26 20:23:45 compute-0 nova_compute[183177]: 2026-01-26 20:23:45.460 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:46 compute-0 nova_compute[183177]: 2026-01-26 20:23:46.289 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:46 compute-0 podman[222594]: 2026-01-26 20:23:46.334883661 +0000 UTC m=+0.083130880 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:23:48 compute-0 nova_compute[183177]: 2026-01-26 20:23:48.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:23:50 compute-0 nova_compute[183177]: 2026-01-26 20:23:50.461 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:51 compute-0 nova_compute[183177]: 2026-01-26 20:23:51.290 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:55 compute-0 nova_compute[183177]: 2026-01-26 20:23:55.464 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:56 compute-0 nova_compute[183177]: 2026-01-26 20:23:56.291 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:23:58 compute-0 sshd-session[222618]: Connection closed by authenticating user root 188.166.116.149 port 59966 [preauth]
Jan 26 20:23:59 compute-0 podman[192499]: time="2026-01-26T20:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:23:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:23:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2206 "" "Go-http-client/1.1"
Jan 26 20:24:00 compute-0 nova_compute[183177]: 2026-01-26 20:24:00.466 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:01 compute-0 nova_compute[183177]: 2026-01-26 20:24:01.293 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:01 compute-0 openstack_network_exporter[195363]: ERROR   20:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:24:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:24:01 compute-0 openstack_network_exporter[195363]: ERROR   20:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:24:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:24:05 compute-0 nova_compute[183177]: 2026-01-26 20:24:05.470 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:06 compute-0 nova_compute[183177]: 2026-01-26 20:24:06.295 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:10 compute-0 nova_compute[183177]: 2026-01-26 20:24:10.473 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:11 compute-0 nova_compute[183177]: 2026-01-26 20:24:11.297 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:11 compute-0 sshd-session[222620]: Connection closed by authenticating user root 142.93.140.142 port 46172 [preauth]
Jan 26 20:24:12 compute-0 podman[222623]: 2026-01-26 20:24:12.326494829 +0000 UTC m=+0.076918772 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 26 20:24:12 compute-0 podman[222624]: 2026-01-26 20:24:12.336391365 +0000 UTC m=+0.079488450 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 26 20:24:12 compute-0 podman[222622]: 2026-01-26 20:24:12.404346525 +0000 UTC m=+0.150414281 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 26 20:24:15 compute-0 nova_compute[183177]: 2026-01-26 20:24:15.474 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:16 compute-0 nova_compute[183177]: 2026-01-26 20:24:16.051 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:16 compute-0 nova_compute[183177]: 2026-01-26 20:24:16.298 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:17 compute-0 podman[222688]: 2026-01-26 20:24:17.320521729 +0000 UTC m=+0.068003572 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:24:20 compute-0 nova_compute[183177]: 2026-01-26 20:24:20.510 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:21 compute-0 nova_compute[183177]: 2026-01-26 20:24:21.301 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:24:24.131 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:24:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:24:24.131 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:24:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:24:24.131 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:24:25 compute-0 nova_compute[183177]: 2026-01-26 20:24:25.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:25 compute-0 nova_compute[183177]: 2026-01-26 20:24:25.512 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:26 compute-0 nova_compute[183177]: 2026-01-26 20:24:26.304 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.672 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.673 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.674 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.674 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.871 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.873 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.903 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.904 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5729MB free_disk=73.08882141113281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.904 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:24:27 compute-0 nova_compute[183177]: 2026-01-26 20:24:27.905 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:24:29 compute-0 nova_compute[183177]: 2026-01-26 20:24:29.085 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:24:29 compute-0 nova_compute[183177]: 2026-01-26 20:24:29.085 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:24:27 up  1:48,  0 user,  load average: 0.01, 0.21, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:24:29 compute-0 nova_compute[183177]: 2026-01-26 20:24:29.119 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:24:29 compute-0 nova_compute[183177]: 2026-01-26 20:24:29.626 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:24:29 compute-0 podman[192499]: time="2026-01-26T20:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:24:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:24:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2203 "" "Go-http-client/1.1"
Jan 26 20:24:30 compute-0 nova_compute[183177]: 2026-01-26 20:24:30.139 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:24:30 compute-0 nova_compute[183177]: 2026-01-26 20:24:30.139 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.235s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:24:30 compute-0 nova_compute[183177]: 2026-01-26 20:24:30.566 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:31 compute-0 nova_compute[183177]: 2026-01-26 20:24:31.306 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:31 compute-0 openstack_network_exporter[195363]: ERROR   20:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:24:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:24:31 compute-0 openstack_network_exporter[195363]: ERROR   20:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:24:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:24:34 compute-0 nova_compute[183177]: 2026-01-26 20:24:34.139 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:34 compute-0 nova_compute[183177]: 2026-01-26 20:24:34.140 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:34 compute-0 nova_compute[183177]: 2026-01-26 20:24:34.140 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:34 compute-0 nova_compute[183177]: 2026-01-26 20:24:34.140 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:24:35 compute-0 nova_compute[183177]: 2026-01-26 20:24:35.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:35 compute-0 nova_compute[183177]: 2026-01-26 20:24:35.569 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:36 compute-0 nova_compute[183177]: 2026-01-26 20:24:36.310 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:38 compute-0 sshd-session[222715]: Connection closed by authenticating user root 188.166.116.149 port 35956 [preauth]
Jan 26 20:24:40 compute-0 nova_compute[183177]: 2026-01-26 20:24:40.611 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:41 compute-0 nova_compute[183177]: 2026-01-26 20:24:41.313 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:43 compute-0 nova_compute[183177]: 2026-01-26 20:24:43.150 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:43 compute-0 podman[222718]: 2026-01-26 20:24:43.330137281 +0000 UTC m=+0.060729216 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 26 20:24:43 compute-0 podman[222719]: 2026-01-26 20:24:43.35569069 +0000 UTC m=+0.073185763 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 26 20:24:43 compute-0 podman[222717]: 2026-01-26 20:24:43.420098994 +0000 UTC m=+0.151706696 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260120, io.buildah.version=1.41.4)
Jan 26 20:24:43 compute-0 nova_compute[183177]: 2026-01-26 20:24:43.662 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:24:45 compute-0 nova_compute[183177]: 2026-01-26 20:24:45.614 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:46 compute-0 nova_compute[183177]: 2026-01-26 20:24:46.315 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:47 compute-0 sshd-session[222782]: Connection closed by authenticating user root 142.93.140.142 port 51960 [preauth]
Jan 26 20:24:48 compute-0 podman[222784]: 2026-01-26 20:24:48.34600097 +0000 UTC m=+0.083363047 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 26 20:24:50 compute-0 nova_compute[183177]: 2026-01-26 20:24:50.660 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:51 compute-0 nova_compute[183177]: 2026-01-26 20:24:51.318 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:55 compute-0 nova_compute[183177]: 2026-01-26 20:24:55.696 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:56 compute-0 nova_compute[183177]: 2026-01-26 20:24:56.320 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:24:59 compute-0 podman[192499]: time="2026-01-26T20:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:24:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:24:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Jan 26 20:25:00 compute-0 nova_compute[183177]: 2026-01-26 20:25:00.738 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:01 compute-0 nova_compute[183177]: 2026-01-26 20:25:01.322 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:01 compute-0 openstack_network_exporter[195363]: ERROR   20:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:25:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:25:01 compute-0 openstack_network_exporter[195363]: ERROR   20:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:25:01 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:25:05 compute-0 nova_compute[183177]: 2026-01-26 20:25:05.740 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:06 compute-0 nova_compute[183177]: 2026-01-26 20:25:06.324 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:10 compute-0 nova_compute[183177]: 2026-01-26 20:25:10.785 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:11 compute-0 nova_compute[183177]: 2026-01-26 20:25:11.326 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:14 compute-0 podman[222816]: 2026-01-26 20:25:14.412439273 +0000 UTC m=+0.142800796 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260120, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 26 20:25:14 compute-0 podman[222810]: 2026-01-26 20:25:14.412595807 +0000 UTC m=+0.148466589 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64)
Jan 26 20:25:14 compute-0 podman[222809]: 2026-01-26 20:25:14.429598694 +0000 UTC m=+0.181869228 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Jan 26 20:25:15 compute-0 nova_compute[183177]: 2026-01-26 20:25:15.826 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:16 compute-0 nova_compute[183177]: 2026-01-26 20:25:16.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:16 compute-0 nova_compute[183177]: 2026-01-26 20:25:16.329 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:18 compute-0 sshd-session[222874]: Connection closed by authenticating user root 188.166.116.149 port 52786 [preauth]
Jan 26 20:25:19 compute-0 podman[222876]: 2026-01-26 20:25:19.358028729 +0000 UTC m=+0.106235492 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:25:20 compute-0 nova_compute[183177]: 2026-01-26 20:25:20.858 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:21 compute-0 nova_compute[183177]: 2026-01-26 20:25:21.331 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:23 compute-0 sshd-session[222899]: Connection closed by authenticating user root 142.93.140.142 port 44396 [preauth]
Jan 26 20:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:25:24.132 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:25:24.133 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:25:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:25:24.133 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:25:25 compute-0 nova_compute[183177]: 2026-01-26 20:25:25.899 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:26 compute-0 nova_compute[183177]: 2026-01-26 20:25:26.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:26 compute-0 nova_compute[183177]: 2026-01-26 20:25:26.333 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:28 compute-0 sshd-session[222902]: Invalid user hadoop from 193.32.162.151 port 48462
Jan 26 20:25:28 compute-0 sshd-session[222902]: Connection closed by invalid user hadoop 193.32.162.151 port 48462 [preauth]
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.666 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.668 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.873 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.875 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.904 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.906 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5725MB free_disk=73.08882141113281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.906 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:25:28 compute-0 nova_compute[183177]: 2026-01-26 20:25:28.907 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:25:29 compute-0 podman[192499]: time="2026-01-26T20:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:25:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:25:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2201 "" "Go-http-client/1.1"
Jan 26 20:25:29 compute-0 nova_compute[183177]: 2026-01-26 20:25:29.959 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:25:29 compute-0 nova_compute[183177]: 2026-01-26 20:25:29.960 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:25:28 up  1:49,  0 user,  load average: 0.06, 0.19, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:25:29 compute-0 nova_compute[183177]: 2026-01-26 20:25:29.988 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:25:30 compute-0 nova_compute[183177]: 2026-01-26 20:25:30.495 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:25:30 compute-0 nova_compute[183177]: 2026-01-26 20:25:30.943 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:31 compute-0 nova_compute[183177]: 2026-01-26 20:25:31.021 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:25:31 compute-0 nova_compute[183177]: 2026-01-26 20:25:31.022 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:25:31 compute-0 nova_compute[183177]: 2026-01-26 20:25:31.336 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:31 compute-0 openstack_network_exporter[195363]: ERROR   20:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:25:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:25:31 compute-0 openstack_network_exporter[195363]: ERROR   20:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:25:31 compute-0 openstack_network_exporter[195363]: 
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.022 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.022 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.023 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:35 compute-0 nova_compute[183177]: 2026-01-26 20:25:35.944 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:36 compute-0 nova_compute[183177]: 2026-01-26 20:25:36.336 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:40 compute-0 nova_compute[183177]: 2026-01-26 20:25:40.977 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:41 compute-0 nova_compute[183177]: 2026-01-26 20:25:41.339 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:44 compute-0 nova_compute[183177]: 2026-01-26 20:25:44.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:25:45 compute-0 podman[222907]: 2026-01-26 20:25:45.35832982 +0000 UTC m=+0.080666624 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260120, org.label-schema.schema-version=1.0)
Jan 26 20:25:45 compute-0 podman[222906]: 2026-01-26 20:25:45.364117666 +0000 UTC m=+0.093816058 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git)
Jan 26 20:25:45 compute-0 podman[222905]: 2026-01-26 20:25:45.428785257 +0000 UTC m=+0.164968464 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20260120)
Jan 26 20:25:46 compute-0 nova_compute[183177]: 2026-01-26 20:25:46.021 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:46 compute-0 nova_compute[183177]: 2026-01-26 20:25:46.340 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:50 compute-0 podman[222968]: 2026-01-26 20:25:50.338196498 +0000 UTC m=+0.078332970 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 26 20:25:51 compute-0 nova_compute[183177]: 2026-01-26 20:25:51.024 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:51 compute-0 nova_compute[183177]: 2026-01-26 20:25:51.342 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:56 compute-0 nova_compute[183177]: 2026-01-26 20:25:56.059 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:56 compute-0 nova_compute[183177]: 2026-01-26 20:25:56.343 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:25:59 compute-0 sshd-session[222994]: Connection closed by authenticating user root 188.166.116.149 port 45708 [preauth]
Jan 26 20:25:59 compute-0 sshd-session[222996]: Connection closed by authenticating user root 142.93.140.142 port 39026 [preauth]
Jan 26 20:25:59 compute-0 podman[192499]: time="2026-01-26T20:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:25:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:25:59 compute-0 podman[192499]: @ - - [26/Jan/2026:20:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2200 "" "Go-http-client/1.1"
Jan 26 20:26:01 compute-0 nova_compute[183177]: 2026-01-26 20:26:01.063 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:01 compute-0 nova_compute[183177]: 2026-01-26 20:26:01.345 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:01 compute-0 openstack_network_exporter[195363]: ERROR   20:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:26:01 compute-0 openstack_network_exporter[195363]: ERROR   20:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:26:06 compute-0 nova_compute[183177]: 2026-01-26 20:26:06.066 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:06 compute-0 nova_compute[183177]: 2026-01-26 20:26:06.347 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:11 compute-0 nova_compute[183177]: 2026-01-26 20:26:11.067 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:11 compute-0 nova_compute[183177]: 2026-01-26 20:26:11.349 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:16 compute-0 nova_compute[183177]: 2026-01-26 20:26:16.074 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:16 compute-0 podman[223000]: 2026-01-26 20:26:16.34061981 +0000 UTC m=+0.079616275 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260120, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 26 20:26:16 compute-0 podman[222999]: 2026-01-26 20:26:16.342898 +0000 UTC m=+0.079182472 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 26 20:26:16 compute-0 nova_compute[183177]: 2026-01-26 20:26:16.351 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:16 compute-0 podman[222998]: 2026-01-26 20:26:16.406671528 +0000 UTC m=+0.148866930 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260120, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 26 20:26:18 compute-0 nova_compute[183177]: 2026-01-26 20:26:18.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:21 compute-0 nova_compute[183177]: 2026-01-26 20:26:21.081 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:21 compute-0 podman[223059]: 2026-01-26 20:26:21.337270101 +0000 UTC m=+0.080851888 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 26 20:26:21 compute-0 nova_compute[183177]: 2026-01-26 20:26:21.353 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:26:24.134 104672 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:26:24.134 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:26:24 compute-0 ovn_metadata_agent[104667]: 2026-01-26 20:26:24.134 104672 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:26:26 compute-0 nova_compute[183177]: 2026-01-26 20:26:26.082 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:26 compute-0 nova_compute[183177]: 2026-01-26 20:26:26.355 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:28 compute-0 nova_compute[183177]: 2026-01-26 20:26:28.153 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:29 compute-0 podman[192499]: time="2026-01-26T20:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 26 20:26:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15350 "" "Go-http-client/1.1"
Jan 26 20:26:29 compute-0 podman[192499]: @ - - [26/Jan/2026:20:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2206 "" "Go-http-client/1.1"
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.667 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.668 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.669 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.846 183181 WARNING nova.virt.libvirt.driver [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.847 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.868 183181 DEBUG oslo_concurrency.processutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.869 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.08882141113281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.869 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 26 20:26:30 compute-0 nova_compute[183177]: 2026-01-26 20:26:30.870 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 26 20:26:31 compute-0 nova_compute[183177]: 2026-01-26 20:26:31.093 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:31 compute-0 nova_compute[183177]: 2026-01-26 20:26:31.358 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:31 compute-0 openstack_network_exporter[195363]: ERROR   20:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 26 20:26:31 compute-0 openstack_network_exporter[195363]: ERROR   20:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 26 20:26:32 compute-0 nova_compute[183177]: 2026-01-26 20:26:32.789 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 26 20:26:32 compute-0 nova_compute[183177]: 2026-01-26 20:26:32.790 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 20:26:30 up  1:50,  0 user,  load average: 0.02, 0.15, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 26 20:26:32 compute-0 nova_compute[183177]: 2026-01-26 20:26:32.864 183181 DEBUG nova.compute.provider_tree [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed in ProviderTree for provider: a47e311f-639f-4d60-b79d-85bbf53e2f35 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 26 20:26:33 compute-0 nova_compute[183177]: 2026-01-26 20:26:33.374 183181 DEBUG nova.scheduler.client.report [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Inventory has not changed for provider a47e311f-639f-4d60-b79d-85bbf53e2f35 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 26 20:26:33 compute-0 nova_compute[183177]: 2026-01-26 20:26:33.886 183181 DEBUG nova.compute.resource_tracker [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 26 20:26:33 compute-0 nova_compute[183177]: 2026-01-26 20:26:33.886 183181 DEBUG oslo_concurrency.lockutils [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.016s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 26 20:26:35 compute-0 nova_compute[183177]: 2026-01-26 20:26:35.887 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:35 compute-0 nova_compute[183177]: 2026-01-26 20:26:35.887 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:35 compute-0 nova_compute[183177]: 2026-01-26 20:26:35.888 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:35 compute-0 nova_compute[183177]: 2026-01-26 20:26:35.888 183181 DEBUG nova.compute.manager [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 26 20:26:36 compute-0 nova_compute[183177]: 2026-01-26 20:26:36.097 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:36 compute-0 nova_compute[183177]: 2026-01-26 20:26:36.359 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:36 compute-0 sshd-session[223086]: Connection closed by authenticating user root 142.93.140.142 port 55712 [preauth]
Jan 26 20:26:37 compute-0 nova_compute[183177]: 2026-01-26 20:26:37.154 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:40 compute-0 sshd-session[223088]: Connection closed by authenticating user root 188.166.116.149 port 52904 [preauth]
Jan 26 20:26:41 compute-0 nova_compute[183177]: 2026-01-26 20:26:41.124 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:41 compute-0 nova_compute[183177]: 2026-01-26 20:26:41.361 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:43 compute-0 sshd-session[223090]: Accepted publickey for zuul from 192.168.122.10 port 49300 ssh2: ECDSA SHA256:I5PR7gCG3R+fLIKBzeyO8iqaNGSkk3Wt1QwfdEN+jyk
Jan 26 20:26:43 compute-0 systemd-logind[794]: New session 31 of user zuul.
Jan 26 20:26:43 compute-0 systemd[1]: Started Session 31 of User zuul.
Jan 26 20:26:43 compute-0 sshd-session[223090]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 26 20:26:43 compute-0 nova_compute[183177]: 2026-01-26 20:26:43.149 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:43 compute-0 sudo[223094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 26 20:26:43 compute-0 sudo[223094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 26 20:26:46 compute-0 nova_compute[183177]: 2026-01-26 20:26:46.127 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:46 compute-0 nova_compute[183177]: 2026-01-26 20:26:46.152 183181 DEBUG oslo_service.periodic_task [None req-4cf82f4d-ec51-499c-820d-c442231e2e73 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 26 20:26:46 compute-0 nova_compute[183177]: 2026-01-26 20:26:46.362 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:47 compute-0 podman[223241]: 2026-01-26 20:26:47.349041602 +0000 UTC m=+0.084751483 container health_status c221410f9689f37e4453e381ed899df1a727d8efa195f8d0da075c4484e98ed6 (image=38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260120, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 26 20:26:47 compute-0 podman[223240]: 2026-01-26 20:26:47.368566457 +0000 UTC m=+0.106631792 container health_status b8e954656d5c2fc2e6531684e593a84eb3bf45fa3cc1d3954bba475f293b837b (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 26 20:26:47 compute-0 podman[223239]: 2026-01-26 20:26:47.413503047 +0000 UTC m=+0.154932912 container health_status 790111bcb325ea4db0b4f90b767d82ed56e495171808a934fff97ed4af990172 (image=38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'bcd9a0f79dc04753bf98ddbb572d460f4ef39166f4ce4ba24868c39fdbd0b90e-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.223:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260120, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 26 20:26:47 compute-0 ovs-vsctl[223324]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 26 20:26:49 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 26 20:26:49 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 26 20:26:49 compute-0 virtqemud[182929]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 26 20:26:50 compute-0 crontab[223718]: (root) LIST (root)
Jan 26 20:26:51 compute-0 nova_compute[183177]: 2026-01-26 20:26:51.128 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:51 compute-0 nova_compute[183177]: 2026-01-26 20:26:51.366 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:52 compute-0 podman[223811]: 2026-01-26 20:26:52.337608405 +0000 UTC m=+0.076369906 container health_status 905ae020e1b92d97cc4ee66fb41577957fd3b6dadc6b96db7f4048fbbe42f2ba (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '9166a121f286568a7e86ff294722e67899fa43930fb90014de8efa9ad78a26c2-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 26 20:26:52 compute-0 systemd[1]: Starting Hostname Service...
Jan 26 20:26:52 compute-0 systemd[1]: Started Hostname Service.
Jan 26 20:26:56 compute-0 nova_compute[183177]: 2026-01-26 20:26:56.132 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 26 20:26:56 compute-0 nova_compute[183177]: 2026-01-26 20:26:56.367 183181 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
